Morality Development and Its Influence on Emotion, Attitudes, and Decision Making

Abstract

While morality research spans decades, few papers have explored the evolution of the field, and fewer still have sought to synthesize current research. This paper seeks to fill these gaps. This review explores how the construct of morality relates to human social behavior. We examine the development of morality by considering evidence for biological and social influences. We review Western morality theories and research techniques, including the strengths and criticisms of modern morality research methods. We conclude the paper by considering the social, political, and environmental impacts on moral-based emotion and decision making.

Citation:

Rook, S., Stephenson, N., Ortega, J., de Calvo, M., & Iyer-Eimerbrink, P. (2021). Morality Development and Its Influence on Emotion, Attitudes, and Decision Making. Psychology, 12, 1722-1741. doi: 10.4236/psych.2021.1210104.

1. Definition of Morality

The definition of morality has undergone changes in both composition and understanding. Previously considered to be a personal quality that one either possessed or did not (Wendorf, 2001), morality is now understood to be a complex social construct that emerges from an individual’s effort to reconcile personal values of right and wrong against societal norms and expectations (Rest, Narvaez, Thoma, & Bebeau, 2000). As such, individual intent and agency are crucial factors in assessing moral behaviors (Feltz & May, 2017). There is general agreement that morality presupposes the ability to 1) discriminate right from wrong, 2) act on this distinction, and 3) experience pride in righteous behavior and guilt or shame in acts transgressing one’s personal standards (Quinn, Houts, & Graesser, 1994; Shaffer, 1994: p. 341). Morality is also differentiated by engagement in active or inactive moral behaviors, and some experts assert that personal restraint is the highest form of moral behavior (Colby, Kohlberg, Speicher, Hewer, Candee, Gibbs, & Power, 1987; Gilligan, 1977).

2. Theories of Morality

As a high-level overview, this paper provides a brief review of the theoretical frameworks used to guide research in this area and outlines how the work of early theorists influences the way morality is conceptualized today.

2.1. Biological Theories

Early biological theories of morality were based on human behaviors that appeared to violate expected evolutionary norms, most notably the human trait of altruism. Although altruism is defined as behavior intended to benefit or help others, a person engaging in an altruistic act may risk their own survival or safety for the benefit of another (Korsgaard, 2010). Charles Darwin attributed morality to biology as early as 1871, noting that the ability to make moral judgments is unique to humans (Hodgson, 2013). He asserted that the cross-cultural prevalence of moral behavior highlights an underlying biological capacity for moral thinking (Ayala, 2010).

Subsequent biological theories were influenced by observations in the natural world, specifically the expression of altruism in non-humans (see Tan & Hare, 2013 for bonobo studies; Leeks & West, 2019 for altruism in viruses), human moral judgments regarding kinship and incest (Lieberman, Tooby, & Cosmides, 2003), and the recognition of strong genetic influences on moral and prosocial behavior among separated biological twins (Israel, Hasenfratz, & Knafo-Noam, 2015).

2.2. Psychoanalytic Theory

Sigmund Freud’s psychoanalytic theory suggests that morality develops in stages and is governed by a part of consciousness he called the superego. The superego functions as one’s moral conscience and, with the help of the ego, attempts to balance the selfish desires ruled by the id (Freud, 1923). Psychoanalytic theory suggests that morality begins during a premoral period of development in which a child’s moral framework is relatively simple and largely based on avoiding punishment (Freud, 1925 as cited in Krebs, 2004). According to Freud, moral motivation for the sake of reward or a sense of correctness occurs after an adolescent internalizes cultural and parental norms (influencing the development of the superego). Once these norms have been internalized, the adolescent’s moral reasoning results from conflict between the individual’s perfect ideology (superego) and imperfect expression of that ideology (ego). The superego is the final structure to emerge in a developing person and still operates from a primitive desire to avoid punishment. Thus, it could be argued that psychoanalytic theory regards moral reasoning as an extrinsic process that occurs in the context of environmental transgression and punishment (Herman, 2005).

2.3. Social Learning Theory

In 1961, Albert Bandura led a study on aggression, known as the Bobo doll study, that illustrated observational learning and later informed social learning theory (Bandura, 1977). In the Bobo doll study, young children were exposed to aggressive behavior modeled by an adult. In the aggressive condition, adult demonstrators would beat and verbally berate an inflatable clown doll. When children in the aggressive condition were given the opportunity to play with the same clown doll, 90% of them mimicked the aggressive behaviors of the demonstrators (Bandura et al., 1961).

With the understanding that observed behaviors were mimicked and adopted by the participants, Bandura developed social learning theory (Bandura, 1977), attributing the development of behavior to cognitive factors and environmental stimuli. Social learning theorists propose that moral behavior is learned from a young age through social interactions and learning experiences typically provided by the child’s family (Shaffer, 2009: p. 343).

Bandura’s Social Cognitive Theory of Moral Thought and Action (Bandura, 1991) builds on his previous research and focuses on the self-regulation of behavior, finding that “self-regulatory mechanisms play an integral part in moral agency” (Bandura, 1991). Per this theory, we develop a moral self, and we regulate our behavior to conform to our own conception of right and wrong.

2.4. Cognitive Theory

Several researchers have attempted to organize observed patterns of moral development into universal developmental frameworks. Early work in this area (e.g. Piaget) assumed that all humans experience a progressive development of moral identity. Piaget’s theory of moral development contains three sequential stages: premoral period, heteronomous morality, and autonomous morality.

The premoral period, spanning roughly the preschool years, is characterized by limited awareness and understanding of the rules of the surrounding society. Heteronomous morality, occurring from ages 6 to 10, treats rules imposed by outside forces such as authority figures as sacred and unwavering. A hallmark of this level is a belief in immanent justice, an automatic punishment issued following a wrongful behavior regardless of intention. Autonomous morality, emerging around ages 10 to 11, involves the understanding that rules are merely agreements between individuals, derived from one’s personal motivations and beliefs. In this stage, equitable distribution of punishment and rewards (distributive justice) replaces the rigid structure of immanent justice (Huitt & Hummel, 2003: p. 2; Piaget, 1965; Shaffer, 2009: pp. 342-343).

Kohlberg expanded the framework of earlier models to include moral development occurring into adulthood. His theory propounds three moral levels: preconventional, conventional, and postconventional, each containing two stages. Preconventional morality is self-serving, involving a desire by children to avoid punishment (stage 1) or achieve personal rewards (stage 2). Conventional morality posits that rule obeying occurs to win the approval of others (stage 3) or to maintain social order (stage 4). Kohlberg’s final level, postconventional morality, is reached by only 10% - 15% of adults. This level is characterized by an individual’s desire for laws to universally benefit all (stage 5) and by engagement in ideal moral reasoning (stage 6), where morality is based on individual ethical principles dictated by one’s conscience (McLeod, 2013).

3. Development of Morality

Morality is the socio-cultural concept of right and wrong (Rest, Narvaez, Thoma, & Bebeau, 2000). It develops in reaction to, and in interaction with, environmental factors (Eskine et al., 2011), supervision (Migliore et al., 2018; Zhu et al., 2015), cultural values (Huppert et al., 2019), and religious beliefs (Jinpa, 2016; Purzycki, Henrich, Apicella, Atkinson, Baimel, Cohen, & Norenzayan, 2018).

Moral traits are evident in humans by early childhood. Young children demonstrate rudimentary moral foundations with the ability to recognize emotions in others and employ fairness in their interactions (Killen & Smetana, 2015). Recognizing moral reasoning, however, does not require adhering to it: the responses young children provide to moral dilemmas frequently indicate that children believe retaliation is a good reason to behave immorally (Malti, Gasser, & Buchmann, 2009). By their teenage years, adolescents can formulate much more complicated moral reasoning regarding provocation and aggressive behavior (Arsenio, Adams, & Gold, 2009).

To understand the universal framework within which morality operates, we briefly consider the development of empathy.

Development of Empathy

The psychological mechanisms underlying empathy emerge very early. Research measuring electrical activity in the brains of newborn infants indicates that the ability to distinguish, and exhibit preferences for, emotional content in human vocalizations develops within a few days of birth (Decety & Howard, 2013). Empathic reasoning can be observed in helping behaviors and expressions displayed by very young children (Killen & Smetana, 2015).

Empathy has been observed in humans cross-culturally. A study by Huppert et al. (2019), conducted in thirteen countries, found that children (n = 2163) follow similar moral strategies for sharing resources. Young children in the study showed a universal preference for equally sharing resources, independent of their cultural background. Older participants’ attitudes indicated an age-related shift to prefer need-based sharing of resources. While all the children developed these attitude changes at the same rate and in the same order, children from individualistic cultures favored equitable behavior more often in dilemmas relating to wealth and merit, whereas children from collectivistic cultures favored equity in dilemmas relating to empathy (Huppert et al., 2019).

As children continue their cognitive and emotional development, they become better able to express empathy in various perspective-taking situations. Empathy continues to develop throughout the lifespan, shaped by genetic, neurological, temperamental, and socialization factors (McDonald & Messinger, 2011). The manner in which one’s empathy develops greatly impacts and promotes prosocial behavior, altruistic behavior, and the internalization of societal rules.

The internalization of societal rules has been recognized as a strong precursor to learning right from wrong. Aksan and Kochanska (2005) assessed how children’s (33 to 45 months old) empathy levels would influence their rule-following and helping behaviors. The results indicate that children who experienced more guilt (as measured by leading the child to believe that they personally caused harm to another person), as well as those who experienced empathic distress (as measured by physical and verbal gestures after seeing a stranger drop a box on their foot), were more likely to follow rules given to them even when unsupervised. These findings indicate that empathy acts as a motivator to accept rules issued by an authority figure, particularly when the consequences of not following those rules are negative and modeled by others. This research emphasizes the importance of empathy as a means of connecting with others.

4. Morality Research Methods, Criticism and Limitations

Cross-cultural exploration of morality is prolific (Brennan & Houde, 2017; Marquette, 2012; Moheghi, Ghorbanzadeh, & Abedi, 2020). In America, initial methods of inquiry relied on philosophical reasoning, and these methods persisted through the end of the 19th century (see Wendorf (2001) for a complete review of morality research prior to the 20th century).

The beginning of the 20th century brought the first structured morality measures, spurred by the development of intelligence testing by Alfred Binet and closely aligned with personality differences (Wendorf, 2001). Like most psychological tests used at the turn of the century, these measures tended to be culturally and socioeconomically biased. They did, however, provide an early framework for more robust empirical methods (Guthrie, 2004; Kaplan & Saccuzzo, 2017; Leming, 2008; Wendorf, 2001).

4.1. Piaget

Jean Piaget’s Moral Development Scale (MDS) forms the basis of modern, Western moral development measurement and assesses individual moral development along a continuum of possible expressions (Kurtines & Pimm, 1983). Employing a combination of morally themed stories, structured interviews, and play observation, Piaget’s theory is regarded as the origin of present-day moral development theory. While Piaget’s work was considered seminal, regard for his theory came as a result of his efforts to understand the dynamic methods by which morality is acquired and expressed in children (Piaget, 1965; Wendorf, 2001). Critics of Piaget, however, argue that his model relied too heavily on genetic expression (Lickona, 1969) and that his stage theory was culturally (Eurocentric) biased (Weinreich, 1975). Piaget’s work on moral development is further criticized for its lack of focus on adulthood: because he assumed that moral development is completed in childhood, his theory offers little understanding of moral development as one ages (Lourenço & Machado, 1996).

4.2. Kohlberg

Lawrence Kohlberg built on Piaget’s MDS with his own stage theory. Expanding the scope of moral development to reflect the possibility of moral growth in adulthood (Weinreich, 1975), Kohlberg used moral dilemmas and a formal psychological interview to assess and score moral development over the lifespan (Colby et al., 1987). Although not reliant on genetic inheritance like Piaget’s, Kohlberg’s system still exhibited gender and cultural bias: the higher levels of moral attainment in his theory tended to rely on modes of moral reasoning traditionally associated with masculinity (Shaffer, 2009).

4.3. Eisenberg

Nancy Eisenberg expanded the scope of moral development to include the development and experience of moral emotions (Eisenberg, 2000; Eisenberg, Cumberland, Guthrie, Murphy, & Shepard, 2005). Eisenberg conducted her research using moral dilemmas, often including observational components in order to better understand the circumstances under which moral judgment and prosocial behavior are associated (Eisenberg & Shell, 1986). Unlike her predecessors, her work also aimed to include a cross-cultural context for moral development (Carlo, Koller, Eisenberg, Da Silva, & Frohlich, 1996).

4.4. Rest

James Rest reworked Kohlberg’s moral stage format into a quantitative comparison of multiple factors that collectively predict moral behavior (Rest, Narvaez, Thoma, & Bebeau, 2000; Rose, 2012). Called the Defining Issues Test (the current version is the DIT-2), Rest’s measure allows comparison of scores in multiple morality-related areas without subjecting those scores to hierarchical comparisons. Multiple versions of the DIT-2 are available, including survey-type measures and online surveys that participants can complete without the aid of a researcher (Xu, Iran-Nejad, & Thoma, 2007).

4.5. Moral Foundations Theory

Intended to be a universal theory of morality, Moral Foundations Theory (MFT) is based on cognitive social models and attempts to account for a broad array of moral attitudes, values, and behaviors. The theory is based on four principles of human morality: 1) that all humans are predisposed to develop moral views, 2) that the manifestation of native morality is shaped by the culture in which an individual develops and lives, 3) that moral inclinations occur quickly when prompted and often rely on heuristics, and 4) that multiple moral foundations underlie the expression of human morality (Graham et al., 2013).

While MFT recognizes multiple primary moral foundations, the five most frequently researched foundations are Care/Harm, Fairness/Cheating, Loyalty/Betrayal, Authority/Subversion, and Purity/Degradation (Graham et al., 2013; Graham et al., 2018).

5. Are Current Morality Measures Biased?

5.1. Stage Theory

Kohlberg’s stages were conceptualized using Western population samples, despite his insistence that they reflected universal human development (Moheghi, Ghorbanzadeh, & Abedi, 2020). The most well-known critique of Kohlberg’s theory came from Carol Gilligan, who argued that Kohlberg’s model was biased toward males. Kohlberg developed his theory using interviews conducted primarily with male participants, and his data indicated that while males generally reasoned at stage 4 of conventional morality, females typically did not reason past stage 3. In response to this finding, Gilligan began exploring ways to assess gender differences in moral development. While her research led to several interesting discoveries, her earlier work indicated a cultural socialization of men to be more independent than women. This finding suggests a motivation by men to view moral dilemmas as a conflict between two or more parties that can be resolved using established laws, a resolution characteristic of stage 4 of Kohlberg’s model. Conversely, Gilligan argues that women are raised to be more caring and focused on the welfare of others, a resolution that rigidly places them in stage 3 of Kohlberg’s theory. Given these findings, Gilligan’s work strongly supports a gender difference in moral expression and the need for a more neutral and comprehensive method of studying morality (Shaffer, 2009: p. 349).

5.2. Defining Issues Test

The DIT-2 seems to exhibit high reliability when administered across several culturally diverse groups. Nevertheless, criticisms regarding the face validity of the test have spurred much debate. A meta-analysis by Rose (2012) found differences in scoring based on university attendance: university students tend to score higher in moral reasoning, suggesting a potential educational component in increased scores. Interestingly, this analysis also found that students attending Christian universities scored at or below the average of their counterparts at non-religious universities. Regardless of these findings, proctored (paper and pencil) and unsupervised (online) survey formats produce comparable reliability and validity scores, making the measure much more time-efficient than traditional interview research methods (Xu, Iran-Nejad, & Thoma, 2007).

5.3. Moral Foundations Questionnaire

Based on Moral Foundations Theory, the Moral Foundations Questionnaire (MFQ) is a survey built around five domains: Harm, Fairness, Ingroup, Authority, and Purity. Most criticism is directed at the underlying theory supporting the measure. While critics question whether focusing on only these five domains, to the exclusion of others, is most appropriate (Clifford, Iyengar, Cabeza, & Sinnott-Armstrong, 2015), the primary authors of the theory continue to support their use as the best predictors of moral behavior (Graham et al., 2018).

The Moral Foundations Questionnaire is one of the most thoroughly validated moral measurements currently in use. Pearson correlations of the MFQ’s test-retest reliability have been moderate, ranging from .68 to .82 (Graham, Nosek, Haidt, Iyer, Koleva, & Ditto, 2011).
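To make this kind of reliability estimate concrete, the minimal sketch below computes a test-retest Pearson correlation for two administrations of a single MFQ-style subscale. The scores are invented for illustration only; they are not data from Graham et al. (2011), and the subscale label is a hypothetical example.

```python
# Minimal sketch: test-retest reliability estimated as a Pearson correlation.
# The two score lists are hypothetical subscale means for the same ten
# respondents at two time points (illustrative values, not real MFQ data).
import statistics


def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    mean_x, mean_y = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5


time_1 = [3.8, 4.2, 2.9, 3.5, 4.6, 3.1, 2.4, 4.0, 3.7, 3.3]  # first administration
time_2 = [3.6, 4.4, 3.1, 3.4, 4.5, 2.8, 2.6, 4.1, 3.5, 3.6]  # retest weeks later

print(f"test-retest r = {pearson_r(time_1, time_2):.2f}")
```

A correlation in the reported .68 to .82 range would indicate that respondents keep roughly the same relative standing on the subscale across the two administrations.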

The MFQ survey can identify patterns of moral attitudes across diverse population samples. While all the domains are well-validated (Graham, Nosek, Haidt, Iyer, Koleva, & Ditto, 2011), it is important to note that the Purity scale only achieved the same predictive strength as the other domains with the inclusion of an item asking about personal religious attendance. This weakness is most evident when sampling from non-religious populations. A 2017 study found poor comparative fit indices for the purity (CFI = .50) and authority (CFI = .64) domains when comparing results from religious and non-religious participants (Davis, Dooley, Hook, Choe, & McElroy, 2017).

The MFQ has also been translated into multiple languages and shows a high degree of validity in cross-cultural assessments, including in Brazil (Moreira, Souza, & Guerra, 2019), China (Du, 2019), and Muslim populations in Turkey (Yilmaz, Harma, Bahçekapili, & Cesur, 2016).

Despite the care taken to document the reliability and validity of the full MFQ scale, some Western researchers have tended to use shortened versions with increased weight placed on religious ideation (Franks & Scherr, 2015). A recent systematic analysis of the MFQ supported the validity and reliability of the official measure and showed that the two most reliable forms of the scale are the 30- and 40-item measures (Tamul, Elson, Ivory, Hotter, Lanier, Wolf, & Martinez-Carrillo, 2020).

6. Morality & Politics

Smith, Alford, Hibbing, Martin, and Hatemi (2017) conceptualize scores on moral foundation component areas as akin to stable dispositional personality traits. Based on this assumption, they hypothesized that changes in moral foundations could predict changes in political orientation. While their results did not support this hypothesis, their findings further support the idea that differences in MFQ scores may reflect the different approaches liberals and conservatives use to reason about moral dilemmas (Emler, Renwick, & Malone, 1983). The five moral foundations of Moral Foundations Theory are stratified into subsections referred to as binding or individualizing foundations (Graham et al., 2013 as cited in Franks & Scherr, 2015).

In an attempt to parse the relationship between sociopolitical orientation and morality, Federico, Weber, Ergun, and Hunt (2013) suggested that only specific sociopolitical attitudes map onto foundational component areas. Their research found that individualizing foundations are consistent with beliefs preferring equality over inequality, while binding foundations are associated with the dimension of openness versus social conformity (Federico et al., 2013). Different moral foundations are often associated with different political philosophies. For instance, those who score high in valuing purity typically align more closely with conservative candidates, while liberals align with the moral foundation of fairness (Franks & Scherr, 2015). Endorsement of binding foundations predicted support for conservative candidates, whereas endorsement of individualizing foundations consistently predicted support for liberal candidates (Franks & Scherr, 2015; Federico et al., 2013).

7. Morality & Personality

Researchers have found that personality traits such as optimism, conscientiousness, and openness to experience are strong predictors of certain moral decision-making and public moral behaviors (e.g., physical distancing behavior during a global pandemic). Positive affect (mood) is associated with increased moral behavior, whereas negative affect predicts less moral behavior (Alivernini et al., 2020). Other personality traits, like anger and spitefulness, have been noted in the current literature to be negatively associated with moral behavior. Nocera et al. (2021) found a statistically significant relationship between online aggression and trait anger, which was mediated by moral disengagement. Zeigler-Hill et al. (2015) found that spitefulness is not related to binding moral values (ingroup loyalty, purity, respect for authority, etc.) and is negatively correlated with individualizing values (sensitivity to harm, fairness).

The collective presence of psychopathy, narcissism, and Machiavellianism (the Dark Triad) has gained more attention in recent years. High levels of Dark Triad traits have been found to correlate negatively with moral decision-making (Karandikar et al., 2019) and positively with symbolic behaviors designed to communicate personal morality for the purpose of individual benefit (r[592] = .14, p < .001; Ok et al., 2021). Francis et al. (2017) found that when moral dilemmas were presented in virtual reality (with a realistic, visual representation of the dilemma) and participants were required to perform a specific action to make their moral decision (e.g. pulling a lever), participants’ individual levels of psychopathy predicted the intensity of physical force used to perform that action. This relationship was noted only when the choice resulted in harmful consequences (n = 40, rs(18) = .45, p < .05; Francis et al., 2017). In a three-year longitudinal study, Sijtsema et al. (2019) found that moral disengagement, antisocial behavior, and the Dark Triad indirectly moderate each other’s development (b = .60, SE = .21, p < .01; b = .47, SE = .23, p < .05).

Correlations between dark personality traits and immorality (Francis et al., 2017; Karandikar et al., 2019; Ok et al., 2021; Sijtsema et al., 2019) are not strictly linear or causal. A causal relationship may be impossible to infer within current experimental paradigms, as a disregard for morality (e.g. moral disengagement) is one of the main factors assessed in the psychometric measurement of Machiavellianism (Williams et al., 1975). We believe this highlights how essential morality is to the concept of human personality.

8. Morality & Attitudes/Beliefs

Moral character can affect an individual’s motivation both directly, via group identity (Ellemers, 2018), and indirectly, through implicit judgments (Dweck, Chiu, & Hong, 1995). An individual’s worldview strongly influences their perception of morality, particularly as it applies to assumptions of fixed or growth personality traits (Hughes, 2015). Research indicates that individual moral assessment is highly fluid and susceptible to environmental context. Eskine et al. (2011) conducted a study to explore whether strong flavors (bitter or sweet) could influence moral disgust. In their small sample of college undergraduates (n = 54), they found that individual judgments of moral transgressions were moderated by the flavors with which participants were primed: bitter flavors primed individuals to react with stronger moral disgust to vignettes featuring moral transgressions (e.g. stealing, lying) compared to control participants exposed to sweet or neutral tastes (Eskine et al., 2011).

Moral character was found to influence leader-follower relationships in a Romanian organization sample (Zhu et al., 2015). According to Zhu and colleagues (2015), group members with a strong sense of morality were likely to share a greater relational affiliation with group leaders and the organization when group leaders exhibited strong moral character.

In addition to strengthening in-group relations, moral-based group affiliation can act as a catalyst to polarize negative affect against out-group members (Parker & Janoff-Bulman, 2013). The perception of morality also directly influences an individual’s attitudes. A large assessment conducted across the United States and Canada (n = 1252 adults) asked participants to describe the morality of everyday events (Hofmann, Wisneski, Brandt, & Skitka, 2014). Participants were polled multiple times each day for several days and asked whether they had experienced or observed (directly or indirectly) actions that they judged as moral. From participant responses, Hofmann et al. (2014) estimated that about 28.9% of everyday events carry moral implications, with the remainder not being morally relevant.

9. Moral Decision Making

While most individuals consider themselves to be moral people (Lu & Chang, 2011; Perugini & Leone, 2009), they frequently make immoral decisions (Jordan et al., 2015; Perugini & Leone, 2009; Xu et al., 2019). Engaging in immoral behavior threatens an individual’s moral self-concept, sometimes prompting them to perform a counterbalancing moral act (Jordan et al., 2015). If everyone believes themselves to be moral, when and why do they make immoral decisions?

Research suggests that an individual is more likely to behave immorally in a high-stakes situation, especially when there is a low chance of being caught. In a series of studies, Ruedy et al. (2013) examined the emotional experience of moral and immoral behavior. Ruedy first asked participants to rate their actual and anticipated feelings in response to several immoral scenarios, including billing extra hours to an employer to get a monetary bonus or having the ability to cheat on a test by overreporting the number of questions answered correctly (study 1). Ruedy’s subsequent studies were largely applications of the scenarios from the first study, including opportunities to easily copy another’s work or to cheat on a set of tests without monetary compensation (Ruedy et al., 2013). In these studies, most participants cheated when the opportunity arose and described a positive emotional experience connected to the thrill of cheating.

In a 2019 study by Xu et al., participants (n = 87) self-reported their moral identity before and after cheating. The participants were invited to take part in a dice-guessing paradigm under the guise of research on human predictive ability. All participants answered questions regarding their moral self-identity before being given the opportunity to predict the result of rolling a 6-sided die. Possible results were coded as “high” (numbers 4 - 6) and “low” (numbers 1 - 3), ensuring any single participant could be expected to guess correctly 50% of the time. In one condition, participants declared their prediction (high versus low) prior to the die roll. In the experimental condition, participants reported their prediction after the die roll. Participants who had the opportunity to cheat reported an average prediction accuracy of 64%, above the 50% expected by chance, suggesting some willingness to cheat. Participants repeated the moral identity questionnaire after being given the opportunity to engage in immoral behavior; results indicated that the opportunity to cheat (whether cheating occurred or not) was associated with a decrease in self-perceived morality (t = 2.89, p = .005; Xu et al., 2019). Other researchers suggest that “falling short” of personal moral ideals causes some people to seek out opportunities to display moral behavior to restore their ideal self-perception (Jordan et al., 2015).
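To make the chance baseline in this paradigm concrete, the sketch below simulates honest predictions of a fair six-sided die, coding rolls of 4-6 as “high” and 1-3 as “low”; accuracy then hovers near 50%, so the 64% self-reported accuracy implies inflated reporting. The number of simulated trials per participant is an illustrative assumption and does not reflect the procedure of Xu et al. (2019).

```python
# Minimal sketch of the chance baseline in a dice-guessing paradigm:
# honest participants commit to "high" (4-6) or "low" (1-3) before a fair
# die roll, so expected accuracy is about 50%. Trial count is illustrative.
import random

random.seed(42)


def honest_accuracy(n_participants=87, n_trials=20):
    """Average accuracy when predictions are declared before the roll."""
    rates = []
    for _ in range(n_participants):
        hits = 0
        for _ in range(n_trials):
            prediction = random.choice(["high", "low"])  # declared in advance
            roll = random.randint(1, 6)
            outcome = "high" if roll >= 4 else "low"
            hits += prediction == outcome
        rates.append(hits / n_trials)
    return sum(rates) / len(rates)


print(f"expected honest accuracy = {honest_accuracy():.1%}")  # close to 50%
print("a reported accuracy of 64% therefore suggests inflated self-reports")
```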

Thought suppression, or one’s deliberate attempt to stop thinking about a target behavior (Wegner et al. 1987; Wegner, 2009 as cited in Yam, 2018), is a coping mechanism sometimes used to avoid immoral behavior. Past research has suggested that thought suppression may have adverse effects, such as rumination on the very thoughts one is trying to suppress. Yam (2018) found support for a relationship between thought suppression and moral behavior, with the key factor being the content that is suppressed: participants who suppressed ethics-related content acted with more moral integrity, whereas when the suppressed content had nothing to do with ethics, moral behavior decreased during and after thought suppression.

Social environments can also affect moral decision-making. Susewind and Walkowitz (2020) found that social recognition of prosocial acts led to less subsequent helpful behavior compared to prosocial acts that went unrecognized (e.g. anonymously donating to charity). In other words, when an altruistic act is witnessed, an individual is less likely to follow it up immediately with a similar action; if the act is not witnessed, one remains more likely to repeat the behavior, potentially for social recognition, when given the opportunity. Other researchers have argued that frequently engaging in moral behavior, regardless of the outcome, can strengthen an individual’s moral self-concept (Borhani, Keshtgar, & Abbaszadeh, 2015; Jordan et al., 2015).

In a 2018 study, Migliore et al. hypothesized that the proximity of authority figures (deontological morality) and victims (empathic morality) significantly affects moral decision-making. They found that moral decision-making is indeed affected by observation: in empathic conditions, participants made faster decisions to act morally, whereas when observed by authority figures, responses slowed. Both empathic and deontological conditions, however, yielded more utilitarian answers than moral scenarios with no observation (Migliore et al., 2018).

10. Morality & Ethics

Ethics is notoriously difficult to define within a framework distinct from morality (Yam, 2018; Zhu et al., 2015). Ethics frequently appears in business and wartime contexts (Noval & Stahl, 2017) and may best be considered a form of prosocial conduct framed as an ideal example of behavior (Walsh, 2015). Morality, with its dilemmas and vignettes, then becomes the personal struggle between an ideal and an imperfect reality (Eskine et al., 2011; Korsgaard, 2010).

Arvanitis (2017) argues that a personal need for autonomy may be the driving motivational force that causes humans to internalize (and thereby compromise) ethical ideals into personal moral norms. Sy et al. (2020) examined competing ethical motivations in their study of illicit drug use in the Philippines. Although illegal, many of the drugs cited in their study are used as mechanisms for increased productivity and job performance, two elements of a work ethic widely endorsed in Philippine culture. Even so, the Philippine government’s “war” on illicit drug use is estimated to have caused in excess of 9000 deaths within 18 months due to violence between rival drug dealers or between drug dealers and law enforcement (Simangan, 2018 as cited in Sy et al., 2020). This presents an interesting ethical dilemma: despite the arguable immorality of deadly violence, illicit drug use appears to be treated as a greater ethical or moral transgression than the deaths of citizens who break the law (Sy et al., 2020).

11. Morality & Religion

Morality is often considered in complement to religion (Jinpa, 2016; Purzycki, Henrich, Apicella, Atkinson, Baimel, Cohen, & Norenzayan, 2018), and in some instances the two are researched without distinction (Franks & Scherr, 2015; McKay & Whitehouse, 2015). However, it has been observed that morality is often demonstrated independently of religious belief (Davis, Dooley, Hook, Choe, & McElroy, 2017), with higher expressions of morality (i.e. universal, trans-cultural morality, as described by Kohlberg’s advanced stages; Colby et al., 1987) challenging religious doctrine on issues such as sexual promiscuity or homosexuality (Saroglou, 2019). Worth noting is that conservative beliefs regarding moral sexual behavior are often a marker of religion-oriented moral concepts (Haidt & Hersh, 2001; Helzer & Pizarro, 2011; Uhlmann, Poehlman, Tannenbaum, & Bargh, 2011).

Due to the common misconception that religion and morality are inseparable, atheists are commonly stereotyped as immoral people, even by other atheists (Wright & Nichols, 2014). It is likely that this globally observed anti-atheist bias results from the widespread belief that moral behavior will only be performed regularly under constant fear of punishment (Gervais, 2014) and that an immoral act would likely be performed by an atheist, who is not afraid of divine punishment (Gervais, Xygalatas, McKay, Van Elk, Buchtel, Aveyard et al., 2017; Simpson, McCurrie, & Rios, 2019). Although atheists do not believe in an all-powerful, all-punishing god, recent evidence (n = 656) suggests there is no difference in moral judgment between atheist and theist individuals (Rabelo & Pilati, 2021).

12. Conclusion

Morality has a rich and complicated history. While many have spent time studying morality and moral development, few have found a precise, inclusive way to conceptualize it. Our review suggests these intricacies are further complicated by social, biological, cultural, and cognitive underpinnings. Current work, however, highlights the considerable strides that have been made in this field. The desire to expand our understanding of morality in topics like religion, politics, decision making, and ethics has challenged researchers and allowed an increasingly nuanced understanding of the topic. While morality has been the focus of decades upon decades of research, it is safe to say that the considerable advances made are rivaled only by the amount of work yet to be done.

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Aksan, N., & Kochanska, G. (2005). Conscience in Childhood: Old Questions, New Answers. Developmental Psychology, 41, 506-516.
https://doi.org/10.1037/0012-1649.41.3.506
[2] Alivernini, F., Manganelli, S., Girelli, L., Cozzolino, M., Lucidi, F., & Cavicchiolo, E. (2020). Physical Distancing Behavior: The Role of Emotions, Personality, Motivations, and Moral Decision-Making. Journal of Pediatric Psychology, 46, 15-26.
https://doi.org/10.1093/jpepsy/jsaa122
[3] Arsenio, W. F., Adams, E., & Gold, J. (2009). Social Information Processing, Moral Reasoning, and Emotion Attributions: Relations with Adolescents’ Reactive and Proactive Aggression. Child Development, 80, 1739-1755.
https://doi.org/10.1111/j.1467-8624.2009.01365.x
[4] Arvanitis, A. (2017). Autonomy and Morality: A Self-Determination Theory Discussion of Ethics. New Ideas in Psychology, 47, 57-61.
https://doi.org/10.1016/j.newideapsych.2017.06.001
[5] Ayala, F. J. (2010). The Difference of Being Human: Morality. Proceedings of the National Academy of Sciences of the United States of America, 107, 9015-9022.
https://doi.org/10.1073/pnas.0914616107
[6] Bandura, A. (1977). Social Learning Theory. Prentice Hall.
[7] Bandura, A. (1991). Social Cognitive Theory of Moral Thought and Action. In W. M. Kurtines, & J. L. Gewirtz (Eds.), Handbook of Moral Behavior and Development, Vol. 1. Theory; Vol. 2. Research; Vol. 3. Application (pp. 45-103). Lawrence Erlbaum Associates, Inc.
[8] Bandura, A., Ross, D., & Ross, S. A. (1961). Transmission of Aggression through the Imitation of Aggressive Models. Journal of Abnormal and Social Psychology, 63, 575-582.
https://doi.org/10.1037/h0045925
[9] Borhani, F., Keshtgar, M., & Abbaszadeh, A. (2015). Moral Self-Concept and Moral Sensitivity in Iranian Nurses. Journal of Medical Ethics and History of Medicine, 8, 4.
[10] Brennan, J. F., & Houde, K. A. (2017). History and Systems of Psychology. Cambridge University Press. https://doi.org/10.1017/9781316827178
[11] Carlo, G., Koller, S. H., Eisenberg, N., Da Silva, M. S., & Frohlich, C. B. (1996). A Cross-National Study on The relations among Prosocial Moral Reasoning, Gender Role Orientations, and Prosocial Behaviors. Developmental Psychology, 32, 231-240.
https://doi.org/10.1037/0012-1649.32.2.231
[12] Clifford, S., Iyengar, V., Cabeza, R., & Sinnott-Armstrong, W. (2015). Moral Foundations Vignettes: A Standardized Stimulus Database of Scenarios Based on Moral Foundations Theory. Behavior Research Methods, 47, 1178-1198.
https://doi.org/10.3758/s13428-014-0551-2
[13] Colby, A., Kohlberg, L., Speicher, B., Hewer, A., Candee, D., Gibbs, J., & Power, C. (1987). The Measurement of Moral Judgement: Volume 2, Standard Issue Scoring Manual (Vol. 2). Cambridge University Press.
[14] Davis, D. E., Dooley, M. T., Hook, J. N., Choe, E., & McElroy, S. E. (2017). The Purity/Sanctity Subscale of the Moral Foundations Questionnaire Does Not Work Similarly for Religious versus Non-Religious Individuals. Psychology of Religion and Spirituality, 9, 124-130. https://doi.org/10.1037/rel0000057
[15] Decety, J., & Howard, L. H. (2013). The Role of Affect in the Neurodevelopment of Morality. Child Development Perspectives, 7, 49-54. https://doi.org/10.1111/cdep.12020
[16] Du, J. (2019). Validation of the Moral Foundations Questionnaire with three Chinese Ethnic Groups. Social Behavior and Personality: An International Journal, 47, 1-12.
https://doi.org/10.2224/sbp.8009
[17] Dweck, C. S., Chiu, C. Y., & Hong, Y. Y. (1995). Implicit Theories and Their Role in Judgments and Reactions: A Word from Two Perspectives. Psychological Inquiry, 6, 267-285. https://doi.org/10.1207/s15327965pli0604_1
[18] Eisenberg, N. (2000). Emotion, Regulation, and Moral Development. Annual Review of Psychology, 51, 665-697. https://doi.org/10.1146/annurev.psych.51.1.665
[19] Eisenberg, N., & Shell, R. (1986). Prosocial Moral Judgment and Behavior in Children: The Mediating Role of Cost. Personality and Social Psychology Bulletin, 12, 426-433.
https://doi.org/10.1177/0146167286124005
[20] Eisenberg, N., Cumberland, A., Guthrie, I. K., Murphy, B. C., & Shepard, S. A. (2005). Age Changes in Prosocial Responding and Moral Reasoning in Adolescence and Early Adulthood. Journal of Research on Adolescence, 15, 235-260.
https://doi.org/10.1111/j.1532-7795.2005.00095.x
[21] Ellemers, N. (2018). Morality and Social Identity. In M. van Zomeren, & J. F. Dovidio (Eds.), The Handbook of the Human Essence. Oxford University Press.
https://doi.org/10.1093/oxfordhb/9780190247577.013.5
[22] Emler, N., Renwick, S., & Malone, B. (1983). The Relationship between Moral Reasoning and Political Orientation. Journal of Personality and Social Psychology, 45, 1073-1080.
https://doi.org/10.1037/0022-3514.45.5.1073
[23] Eskine, K. J., Kacinik, N. A., & Prinz, J. J. (2011). A Bad Taste in the Mouth: Gustatory Disgust Influences Moral Judgment. Psychological Science, 22, 295-299.
https://doi.org/10.1177/0956797611398497
[24] Federico, C. M., Weber, C. R., Ergun, D., & Hunt, C. (2013). Mapping the Connections between Politics and Morality: The Multiple Sociopolitical Orientations Involved in Moral Intuition. Political Psychology, 34, 589-610. https://doi.org/10.1111/pops.12006
[25] Feltz, A., & May, J. (2017). The Means/Side-Effect Distinction in Moral Cognition: A Meta-Analysis. Cognition, 166, 314-327.
https://doi.org/10.1016/j.cognition.2017.05.027
[26] Francis, K. B., Terbeck, S., Briazu, R. A., Haines, A., Gummerum, M., Ganis, G., & Howard, I. S. (2017). Simulating Moral Actions: An Investigation of Personal Force in Virtual Moral Dilemmas. Scientific Reports, 7, Article No. 13954.
https://doi.org/10.1038/s41598-017-13909-9
[27] Franks, A. S., & Scherr, K. C. (2015). Using Moral Foundations to Predict Voting Behavior: Regression Models from the 2012 U.S. Presidential Election. Analyses of Social Issues and Public Policy, 15, 213-232. https://doi.org/10.1111/asap.12074
[28] Freud, S. (1923). The Ego and the Id. The Standard Edition of the Complete Psychological Works of Sigmund Freud, Volume XIX (1923-1925): The Ego and the Id and Other Works (pp. 1-66).
[29] Gervais, W. M. (2014). Everything Is Permitted? People Intuitively Judge Immorality as Representative of Atheists. PLoS ONE, 9, e92302.
https://doi.org/10.1371/journal.pone.0092302
[30] Gervais, W. M., Xygalatas, D., McKay, R. T., Van Elk, M., Buchtel, E. E., Aveyard, M. et al. (2017). Global Evidence of Extreme Intuitive Moral Prejudice against Atheists. Nature Human Behaviour, 1, Article No. 0151. https://doi.org/10.1038/s41562-017-0151
[31] Gilligan, C. (1977). In a Different Voice: Women’s Conceptions of Self and of Morality. Harvard Educational Review, 47, 481-517.
https://doi.org/10.17763/haer.47.4.g6167429416hg5l0
[32] Graham, J., Haidt, J., Koleva, S., Motyl, M., Iyer, R., Wojcik, S. P., & Ditto, P. H. (2013). Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism. In Advances in Experimental Social Psychology (Vol. 47, pp. 55-130). Academic Press.
https://doi.org/10.1016/B978-0-12-407236-7.00002-4
[33] Graham, J., Haidt, J., Motyl, M., Meindl, P., Iskiwitch, C., & Mooijman, M. (2018). Moral Foundations Theory: On the Advantages of Moral Pluralism over Moral Monism. In K. Gray, & J. Graham (Eds.), Atlas of Moral Psychology (pp. 211-222). The Guilford Press.
[34] Graham, J., Nosek, B. A., Haidt, J., Iyer, R., Koleva, S., & Ditto, P. H. (2011). Mapping the Moral Domain. Journal of Personality and Social Psychology, 101, 366-385.
https://doi.org/10.1037/a0021847
[35] Guthrie, R. V. (2004). Even the Rat Was White: A Historical View of Psychology. Pearson Education.
[36] Haidt, J., & Hersh, M. A. (2001). Sexual Morality: The Cultures and Emotions of Conservatives and Liberals. Journal of Applied Social Psychology, 31, 191-221.
https://doi.org/10.1111/j.1559-1816.2001.tb02489.x
[37] Helzer, E. G., & Pizarro, D. A. (2011). Dirty Liberals! Reminders of Physical Cleanliness Influence Moral and Political Attitudes. Psychological Science, 22, 517-522.
https://doi.org/10.1177/0956797611402514
[38] Herman, W. E. (2005). Values Acquisition and Moral Development: An Integration of Freudian, Eriksonian, Kohlbergian and Gilliganian Viewpoints. Online Submission.
[39] Hodgson, G. M. (2013). The Enduring Relevance of Darwin’s Theory of Morality. BioScience, 63, 513-514. https://doi.org/10.1525/bio.2013.63.7.2
[40] Hofmann, W., Wisneski, D. C., Brandt, M. J., & Skitka, L. J. (2014). Morality in Everyday Life. Science, 345, 1340-1343. https://doi.org/10.1126/science.1251560
[41] Hughes, J. S. (2015). Support for the Domain Specificity of Implicit Beliefs about Persons, Intelligence, and Morality. Personality and Individual Differences, 86, 195-203.
https://doi.org/10.1016/j.paid.2015.05.042
[42] Huitt, W., & Hummel, J. (2003). Piaget’s Theory of Cognitive Development. In Educational Psychology Interactive. Valdosta State University.
[43] Huppert, E., Cowell, J. M., Cheng, Y., Contreras‐Ibáñez, C., Gomez‐Sicard, N., Gonzalez-Gadea, M. L., Huepe, D., Ibanez, A., Lee, K., Mahasneh, R., Malcolm‐Smith, S., Salas, N., Selcuk, B., Tungodden, B., Wong, A., Zhou, X., & Decety, J. (2019). The Development of Children’s Preferences for Equality and Equity across 13 Individualistic and Collectivist Cultures. Developmental Science, 22, e12729.
https://doi.org/10.1111/desc.12729
[44] Israel, S., Hasenfratz, L., & Knafo-Noam, A. (2015). The Genetics of Morality and Prosociality. Current Opinion in Psychology, 6, 55-59.
https://doi.org/10.1016/j.copsyc.2015.03.027
[45] Jinpa, T. (2016). A Fearless Heart: How the Courage to be Compassionate Can Transform Our Lives. Avery.
[46] Jordan, J., Leliveld, M. C., & Tenbrunsel, A. E. (2015). The Moral Self-Image Scale: Measuring and Understanding the Malleability of the Moral Self. Frontiers in Psychology, 6, 1878. https://doi.org/10.3389/fpsyg.2015.01878
[47] Kaplan, R. M., & Saccuzzo, D. P. (2017). Psychological Testing: Principles, Applications, and Issues. Cengage Learning.
[48] Karandikar, S., Kapoor, H., Fernandes, S., & Jonason, P. K. (2019). Predicting Moral Decision-Making with Dark Personalities and Moral Values. Personality and Individual Differences, 140, 70-75. https://doi.org/10.1016/j.paid.2018.03.048
[49] Killen, M., & Smetana, J. G. (2015). Origins and Development of Morality. In M. E. Lamb, & R. M. Lerner (Eds.), Handbook of Child Psychology and Developmental Science: Socioemotional Processes (pp. 701-749). John Wiley & Sons, Inc.
https://doi.org/10.1002/9781118963418.childpsy317
[50] Korsgaard, C. M. (2010). Reflections on the Evolution of Morality. The Department of Philosophy at Amherst College. http://nrs.harvard.edu/urn-3:HUL.InstRepos:5141952
[51] Krebs, D. L. (2004). Cultivating Morality and Constructing Moral Systems: How to Make Silk Purses from Sows’ Ears. In C. Crawford, & C. Salmon (Eds.), Evolutionary Psychology, Public Policy and Personal Decisions (pp. 319-342). Lawrence Erlbaum Associates Publishers.
[52] Kurtines, W., & Pimm, J. B. (1983). The Moral Development Scale: A Piagetian Measure of Moral Judgment. Educational and Psychological Measurement, 43, 89-105.
https://doi.org/10.1177/001316448304300112
[53] Leeks, A., & West, S. (2019). Altruism in a Virus. Nature Microbiology, 4, 910-911.
https://doi.org/10.1038/s41564-019-0463-0
[54] Leming, J. S. (2008). Research and Practice in Moral and Character Education: Loosely Coupled Phenomena. In L. Nucci, & D. Narvaez (Eds.), Handbook of Moral and Character Education (pp. 150-174). Routledge. https://doi.org/10.4324/9780203931431-15
[55] Lickona, T. (1969). Piaget Misunderstood: A Critique of the Criticisms of His Theory of Moral Development. Merrill-Palmer Quarterly of Behavior and Development, 15, 337-350.
[56] Lieberman, D., Tooby, J., & Cosmides, L. (2003). Does Morality Have a Biological Basis? An Empirical Test of the Factors Governing Moral Sentiments Relating to Incest. Proceedings of the Royal Society of London. Series B: Biological Sciences, 270, 819-826.
https://doi.org/10.1098/rspb.2002.2290
[57] Lourenço, O., & Machado, A. (1996). In Defense of Piaget’s Theory: A Reply to 10 Common Criticisms. Psychological Review, 103, 143-164.
https://doi.org/10.1037/0033-295X.103.1.143
[58] Lu, H. J., & Chang, L. (2011). The Association between Self-Deception and Moral Self-Concept as Functions of Self-Consciousness. Personality and Individual Differences, 51, 845-849. https://doi.org/10.1016/j.paid.2011.07.014
[59] Malti, T., Gasser, L., & Buchmann, M. (2009). Aggressive and Prosocial Children’s Emotion Attributions and Moral Reasoning. Aggressive Behavior, 35, 90-102.
https://doi.org/10.1002/ab.20289
[60] Marquette, H. (2012). “Finding God” or “Moral Disengagement” in the Fight against Corruption in Developing Countries? Evidence from India and Nigeria. Public Administration and Development, 32, 11-26. https://doi.org/10.1002/pad.1605
[61] McDonald, N. M., & Messinger, D. S. (2011). The Development of Empathy: How, When, and Why. Free Will, Emotions, and Moral Actions: Philosophy and Neuroscience in Dialogue, 23, 333-359.
[62] McKay, R., & Whitehouse, H. (2015). Religion and Morality. Psychological Bulletin, 141, 447-473. https://doi.org/10.1037/a0038455
[63] McLeod, S. A. (2013, October 24). Kohlberg’s Stages of Moral Development. Simply Psychology. https://www.simplypsychology.org/kohlberg.html
[64] Migliore, S., D’Aurizio, G., Parisi, F., Maffi, S., Squitieri, B., Curcio, G., & Mancini, F. (2018). Moral Judgment and Empathic/Deontological Guilt. Psychological Reports, 122, 1395-1411. https://doi.org/10.1177/0033294118787500
[65] Moheghi, M., Ghorbanzadeh, M., & Abedi, J. (2020). The Investigation and Criticism Moral Development Ideas of Kohlberg, Piaget and Gilligan. International Journal of Multicultural and Multireligious Understanding, 7, 362-374.
[66] Moreira, L. V., Souza, M. L. D., & Guerra, V. M. (2019). Validity Evidence of a Brazilian Version of the Moral Foundations Questionnaire. Psicologia: Teoria e Pesquisa, 35, e35513. https://doi.org/10.1590/0102.3772e35513
[67] Nocera, T. R., Dahlen, E. R., Mohn, R. S., Leuty, M. E., & Batastini, A. B. (2021). Dark Personality Traits and Anger in Cyber Aggression Perpetration: Is Moral Disengagement to Blame? Psychology of Popular Media. Advance online publication.
https://doi.org/10.1037/ppm0000295
[68] Noval, L. J., & Stahl, G. K. (2017). Accounting for Proscriptive and Prescriptive Morality in the Workplace: The Double-Edged Sword Effect of Mood on Managerial Ethical Decision Making. Journal of Business Ethics, 142, 589-602.
https://doi.org/10.1007/s10551-015-2767-1
[69] Ok, E., Qian, Y., Strejcek, B., & Aquino, K. (2021). Signaling Virtuous Victimhood as Indicators of Dark Triad Personalities. Journal of Personality and Social Psychology, 120, 1634-1661. https://doi.org/10.1037/pspp0000329
[70] Parker, M. T., & Janoff-Bulman, R. (2013). Lessons from Morality-Based Social Identity: The Power of Outgroup “Hate”, Not Just Ingroup “Love”. Social Justice Research, 26, 81-96. https://doi.org/10.1007/s11211-012-0175-6
[71] Perugini, M., & Leone, L. (2009). Implicit Self-Concept and Moral Action. Journal of Research in Personality, 43, 747-754. https://doi.org/10.1016/j.jrp.2009.03.015
[72] Piaget, J. (1965). The Moral Judgment of the Child. Harcourt, Brace.
[73] Purzycki, B. G., Henrich, J., Apicella, C., Atkinson, Q. D., Baimel, A., Cohen, E. et al. (2018). The Evolution of Religion and Morality: A Synthesis of Ethnographic and Experimental Evidence from Eight Societies. Religion, Brain & Behavior, 8, 101-132.
https://doi.org/10.1080/2153599X.2016.1267027
[74] Quinn, R. A., Houts, A. C., & Graesser, A. C. (1994). Naturalistic Conceptions of Morality: A Question‐Answering Approach. Journal of Personality, 62, 239-262.
https://doi.org/10.1111/j.1467-6494.1994.tb00293.x
[75] Rabelo, A. L., & Pilati, R. (2021). Are Religious and Nonreligious People Different in Terms of Moral Judgment and Empathy? Psychology of Religion and Spirituality, 13, 101-110.
[76] Rest, J. R., Narvaez, D., Thoma, S. J., & Bebeau, M. J. (2000). A Neo-Kohlbergian Approach to Morality Research. Journal of Moral Education, 29, 381-395.
https://doi.org/10.1080/713679390
[77] Rose, J. D. (2012). Development of Moral Reasoning at a Higher Education Institution in Nigeria. Emerging Leadership Journeys, 5, 81-101.
[78] Ruedy, N. E., Moore, C., Gino, F., & Schweitzer, M. E. (2013). The Cheater’s High: The Unexpected Affective Benefits of Unethical Behavior. Journal of Personality and Social Psychology, 105, 531-548.
https://doi.org/10.1037/a0034231
[79] Saroglou, V. (2019). Religion and Related Morality across Cultures. In D. Matsumoto, & H. C. Hwang (Eds.), The Handbook of Culture and Psychology (pp. 724-785). Oxford University Press. https://doi.org/10.1093/oso/9780190679743.003.0022
[80] Shaffer, D. R. (1994). Do “Naturalistic” Conceptions of Morality Provide any Novel Answers? Journal of Personality, 62, 263-268.
https://doi.org/10.1111/j.1467-6494.1994.tb00294.x
[81] Shaffer, D. R. (2009). Social and Personality Development. Wadsworth.
[82] Sijtsema, J. J., Garofalo, C., Jansen, K., & Klimstra, T. A. (2019). Disengaging from Evil: Longitudinal Associations between the Dark Triad, Moral Disengagement, and Antisocial Behavior in Adolescence. Journal of Abnormal Child Psychology, 47, 1351-1365.
https://doi.org/10.1007/s10802-019-00519-4
[83] Simpson, A., McCurrie, C., & Rios, K. (2019). Perceived Morality and Antiatheist Prejudice: A Replication and Extension. The International Journal for the Psychology of Religion, 29, 172-183. https://doi.org/10.1080/10508619.2019.1568142
[84] Smith, K. B., Alford, J. R., Hibbing, J. R., Martin, N. G., & Hatemi, P. K. (2017). Intuitive Ethics and Political Orientations: Testing Moral Foundations as a Theory of Political Ideology. American Journal of Political Science, 61, 424-437.
https://doi.org/10.1111/ajps.12255
[85] Susewind, M., & Walkowitz, G. (2020). Symbolic Moral Self-Completion—Social Recognition of Prosocial Behavior Reduces Subsequent Moral Striving. Frontiers in Psychology, 11, Article ID: 560188. https://doi.org/10.3389/fpsyg.2020.560188
[86] Sy, M. P., Bontje, P., Ohshima, N., & Kiepek, N. (2020). Articulating the Form, Function, and Meaning of Drug Using in the Philippines from the Lens of Morality and Work Ethics. Journal of Occupational Science, 27, 12-21.
https://doi.org/10.1080/14427591.2019.1644662
[87] Tamul, D., Elson, M., Ivory, J. D., Hotter, J. C., Lanier, M., Wolf, J., & Martinez-Carrillo, N. I. (2020). Moral Foundations’ Methodological Foundations: A Systematic Analysis of Reliability in Research Using the Moral Foundations Questionnaire.
https://doi.org/10.31234/osf.io/shcgv
[88] Tan, J., & Hare, B. (2013). Bonobos Share with Strangers. PLoS ONE, 8, e51922.
https://doi.org/10.1371/journal.pone.0051922
[89] Uhlmann, E. L., Poehlman, T. A., Tannenbaum, D., & Bargh, J. A. (2011). Implicit Puritanism in American Moral Cognition. Journal of Experimental Social Psychology, 47, 312-320. https://doi.org/10.1016/j.jesp.2010.10.013
[90] Walsh, R. (2015). Wise Ways of Seeing: Wisdom and Perspectives. Integral Review, 11, 156-174.
[91] Weinreich, H. (1975). Kohlberg and Piaget: Aspects of Their Relationship in the Field of Moral Development. Journal of Moral Education, 4, 201-213.
https://doi.org/10.1080/0305724750040303
[92] Wendorf, C. A. (2001). History of American Morality Research, 1894-1932. History of Psychology, 4, 272-288. https://doi.org/10.1037/1093-4510.4.3.272
[93] Williams, M. L., Hazleton, V., & Renshaw, S. (1975). The Measurement of Machiavellianism: A Factor Analytic and Correlational Study of Mach IV and Mach V. Communications Monographs, 42, 151-159. https://doi.org/10.1080/03637757509375889
[94] Wright, J., & Nichols, R. (2014). The Social Cost of Atheism: How Perceived Religiosity Influences Moral Appraisal. Journal of Cognition and Culture, 14, 93-115.
https://doi.org/10.1163/15685373-12342112
[95] Xu, Y., Iran-Nejad, A., & Thoma, S. J. (2007). Administering Defining Issues Test Online: Do Response Modes Matter. Journal of Interactive Online Learning, 6, 10-27.
[96] Xu, Z. X., Ma, H. K., Wang, Y., & Li, J. (2019). Maybe I Am Not as Moral as I Thought: Calibrating Moral Identity after Immoral Action. Current Psychology, 38, 1347-1354.
https://doi.org/10.1007/s12144-017-9686-5
[97] Yam, K. C. (2018). The Effects of Thought Suppression on Ethical Decision Making: Mental Rebound versus Ego Depletion. Journal of Business Ethics, 147, 65-79.
https://doi.org/10.1007/s10551-015-2944-2
[98] Yilmaz, O., Harma, M., Bahçekapili, H. G., & Cesur, S. (2016). Validation of the Moral Foundations Questionnaire in Turkey and Its Relation to Cultural Schemas of Individualism and Collectivism. Personality and Individual Differences, 99, 149-154.
https://doi.org/10.1016/j.paid.2016.04.090
[99] Zeigler-Hill, V., Noser, A. E., Roof, C., Vonk, J., & Marcus, D. K. (2015). Spitefulness and Moral Values. Personality and Individual Differences, 77, 86-90.
https://doi.org/10.1016/j.paid.2014.12.050
[100] Zhu, W., He, H., Treviño, L. K., Chao, M. M., & Wang, W. (2015). Ethical Leadership and Follower Voice and Performance: The Role of Follower Identifications and Entity Morality Beliefs. The Leadership Quarterly, 26, 702-718.
https://doi.org/10.1016/j.leaqua.2015.01.004
