Understanding Cult Membership:
Beyond “Drinking the Kool-Aid”

(Melissa Greiser wrote this thesis to fulfill a graduation requirement as an Honors student at SUNY New Paltz in 2019.)

Abstract

Although a substantial body of social psychological research on cult membership indicates that the people involved with cults are typical individuals and that many basic factors contribute to their involvement, public perception of cults and their members remains deeply negative. If this research were more widely acknowledged, public perception of cult members might become less negative. Examining the psychology behind cult membership sheds light on the many factors that influence human behavior, which may make it easier for the public to understand how cults can be appealing. Fundamental concepts of social psychology, including affiliation motivation and the need to belong, persuasion and the factors that make it more effective, cognitive dissonance, ingroup bias, and social identity theory, can be used to explain how people become involved in cults and why they choose to remain in these groups.

Introduction

When it comes to cult members, public perception seems to be deeply negative. Several studies examine public perception of cult members, the negative stereotypes and assumptions people hold, and the instances in which the media perpetuate these ideas. Cult members have been depicted as brainwashed followers (Bromley, 1993), and cults have been portrayed as being filled with violence, fraud, strange religious practices, and sexual depravity (Neal, 2011). Contrary to these widely held beliefs and media portrayals, it seems that cult membership can be explained, at least in part, by common psychological phenomena.

Psychological research suggests that cult members do not seem to be significantly different from ourselves or from the other people we encounter every day. Additionally, the motivations that lead to cult membership may be more widespread than they initially seem to the general public, and the persuasion techniques that cult leaders implement are some of the same ones that business leaders and marketing teams take advantage of on a regular basis. Analyzing cult membership through a psychological lens, taking into consideration fundamental human motivations, social influence, and persuasion, can help to mitigate the stigma and stereotypes regarding cult members and generate a more accurate discussion, making it easier to understand how cults can be appealing and why many typical people have become involved with such groups.

Defining “Cult”

When examining cults and their members, it is necessary to first define the term cult. Researchers have proposed a number of different definitions; the characteristics deemed necessary to qualify a group as a cult vary greatly, and some of these characteristics can be seen in mainstream, socially accepted groups. However, these definitions tend to focus on a few common characteristics, demonstrating at least some agreement among researchers about what it means for a group to be a cult. Galanter (1989) describes cults as charismatic groups, which have a shared belief system, maintain social cohesion, are run according to behavioral norms, and are led by a charismatic leader. Rhoads (1997, para. 2) defines a cult as “a group of people who organize around a strong authority figure.” This is a basic description that can be applied to many types of groups, but Rhoads explained that cults are distinct due to their extensive recruitment (traveling to find new members, creating websites and videotapes advertising the group, or encouraging members to tell their friends and family about the group), their methods of controlling members and encouraging changes in their identities (controlling members’ diets or sexual behaviors, insisting that members give up the people and things they previously valued, and requiring that membership in the group become central to their identities), and their self-serving goals (cult leaders wanting to be seen as important or good, or wanting to have power over their members). Groups that fall into the cult category typically exert immense influence over the personal details of members’ lives, isolating them from outside social influences and support systems and insisting that adhering to the group’s strict guidelines is the best way to live. The leader attempts to disguise his desire for power and control as he insists that he is looking out for the best interests of the members (Olsson, 2013). Cult leaders run their groups according to their own agendas, using coercive techniques to control the details of their members’ lives and ultimately harming those involved.

Public Perception of Cult Members

The descriptions of cults and their practices demonstrate the differences between these groups and socially accepted mainstream groups. As mentioned, cults have a number of negative characteristics, such as self-serving goals disguised as altruism and the extreme measures taken to accomplish those goals. After catastrophic cult conditions or events are revealed, such as the siege of the Branch Davidians in Waco, Texas (Hinds, 1993), the mass suicide of Heaven’s Gate (Rabinovitz, 1997), and the mass suicide/homicide of Peoples Temple (Cahill, 1979), people are likely to wonder how the members became involved with these groups in the first place. They may then make assumptions that create a division between themselves and those who have been involved in cults. One way to cope with shocking or negative events is through distancing. When researchers presented individuals with images depicting suffering, those individuals sometimes engaged in distancing; when the images came from locations farther from the viewer, it seemed easier to rely on denial, stereotyping, dehumanization, and blaming the victim (Fernández Villanueva & Revilla Castro, 2016). Being physically farther away from a victim may also cause an individual to feel less personally responsible for that person’s suffering, making it easier to accept the suffering and even to continue inflicting pain (Milgram, 2004).

Distance may be created by assuming that some personal flaw or dispositional difference within cult members led to their involvement. This can provide comfort, as observers reassure themselves that they could never become involved in a cult and that those who were involved got what they deserved (Riggio & Garcia, 2009). This is an example of the Fundamental Attribution Error, the tendency to explain others’ behaviors as products of their dispositions rather than of the situation (Ross, 1977). Attributing behavior to disposition may seem easier and more useful than attributing it to the situation; a person’s personality is typically more salient than the situational factors contributing to their behavior, and assumptions about personality make it easier to form expectations for future behavior (Ichheiser, 1970). Riggio and Garcia (2009) applied this concept directly to Peoples Temple and Jonestown in a study in which students were exposed either to a documentary that explored the situational factors involved in the events at Jonestown or to a lecture that explained the Fundamental Attribution Error. Participants then completed a measure that described an individual’s day and asked them to rate how important various situational and dispositional factors were to the day’s events. Their responses were compared with those of a control group that had been given a lecture about research methods. Students who had been made aware of the power of the situation through the documentary, or taught about the Fundamental Attribution Error, were less likely than control students to overestimate dispositional causes for the members’ involvement with the group. This suggests that the tendency to commit the Fundamental Attribution Error can be reduced by educating the public about the psychological concepts and situational factors involved with cults.

Zimbardo (2005) explained that it can be problematic to assume that one’s behaviors reflect one’s personality, and he insisted that situational factors are much more powerful than people usually realize. He maintains that behavior cannot be examined without considering the context in which it occurs, especially since the responses of other people help to encourage or discourage the behavior in the future. Milgram’s (1963) obedience study, in which participants were urged by an authority figure in a lab coat to continue administering what they believed were electric shocks to another person whenever that person answered a question incorrectly, also demonstrates the power of situational factors. At one point in the study, participants were led to believe that the other person was in a significant amount of pain and had possibly died, yet approximately 65% of participants continued to the strongest shocks; they acted in ways that were not consistent with their usual behavior or with the expectations of the experts Milgram consulted before the start of the study (Milgram, 1963). The same influence of situational factors has been demonstrated in simpler contexts as well. In Asch’s (1951) conformity study, participants had to determine which of several lines matched the length of a given line, a judgment they could easily make correctly; yet once they heard others give a different answer, many abandoned the correct answer for the incorrect one the confederates had reported. All of these studies support the idea that contexts can greatly affect the people within them, and, just like the circumstances presented in these studies, the situational pressures that exist in cults may plausibly produce the same compliance, conformity, and obedience.

In addition to this tendency to assume personality from behavior, people may also rely on schemas and stereotypes. Ashmore and Del Boca (1981) define a stereotype as “a set of beliefs about the personal attributes of a group of people” (16). People may be inclined to depend on stereotypes because attending to every detail of the world around us may be beyond our capabilities; simplifying information makes it easier to understand in general terms without expending too many cognitive resources on nuances (Levine, 2003). Many stereotypes exist regarding cults, including the assumptions that cults always have massive followings, that all cult leaders have amassed great wealth, and that followers have been brainwashed into docile compliance (Bromley, 1993). According to Neal’s (2011) analysis of television shows from 1958 to 2008, the media consistently perpetuate stereotypes about cults and their members, portraying them in the most extreme ways, emphasizing violence, fraud, and differences in clothing, setting, and lifestyle, and lending credibility to mainstream religions while discrediting alternative religions and other groups. The media may portray all “cultic” groups as the same, participating in similar rituals and posing similar problems (Beckford, 1985). Neal (2011) states, “[T]elevision remains one of the most influential vehicles in American culture for the reification and proliferation of stereotypical imagery” (85). Both television news and fictional television shows seem to participate in this cycle of stereotypes, and since approximately 96% of households in the United States own a television, it is difficult to completely avoid being exposed to these stereotypes (Lynch, 2018). Because entertainment shows are fictional, audiences may not necessarily take their stories as factual, and the creators of such shows may feel they are not responsible for producing realistic programs that properly inform their audiences; nevertheless, these shows may still affect the way people feel about the characters, events, or issues depicted. One study demonstrated that consuming fictional television can affect viewers’ attitudes regarding politics and policy (Mutz & Nir, 2010). While news sources are, at least in theory, meant to convey accurate information, they may exaggerate stories for the sake of attracting viewers, especially when it comes to unusual stories (Kilgo, Harlow, García-Perdomo, & Salaverría, 2018). Perhaps, then, when it comes to unusual stories about alternative groups, the media will be more likely to use the term “cult,” which has come to carry a deeply negative connotation. Television news also often reports on the aspects of a particular story that are consistent with the station’s agenda and established beliefs, providing its audience with only a snapshot of the events being discussed rather than the entire story (Deikman, 2003).

Bromley (1993) argues that these myths presented by the media prevent the general public from putting more effort into understanding the mechanisms at work within cults. Robert Hicks (1993), a former police officer who worked with the Virginia Department of Criminal Justice Services, explained that, in his experience, law enforcement officers may be quick to use the term “cult” when dealing with alternative groups and are likely to assume that anyone involved with such groups is evil or morally corrupt in some way. Creating this gap between mainstream, acceptable groups and bad, unacceptable groups can dehumanize the members of cults. Hicks (1993) suggests that this language is what allowed the Bureau of Alcohol, Tobacco, and Firearms to carry out the assault on the Branch Davidians, which ultimately resulted in about 80 deaths. According to his theory, because the cult members had been viewed as less than human, the government may have felt less obligated to handle the situation properly and instead acted in a way that led to a great loss of life. Tabor and Gallagher (1995) also suggest that the government’s rhetoric contributed to this tragedy. Once the government and the media started referring to the Branch Davidians as a “cult,” the general public may have engaged in the same assumptions made by law enforcement. Doing so “unfortunately made it all too easy and attractive to deny Koresh and the other students of the Seven Seals of biblical prophecy their full and complex humanity” (Tabor & Gallagher, 1995, 118).

It seems that distancing ourselves from others and devaluing them without attempting to understand them may make harming them less difficult; rather than being viewed as human beings, they may be seen only as members of a certain undesirable group or as the enemy (Deikman, 1990). The term “cult” and the negative connotations that come along with it seem to have led the government, law enforcement, and the general public to make assumptions without putting any effort into learning more about the group, which may also have made it easier to blame the cult leader and his followers rather than holding the government accountable for its part in the deadly situation.

This tendency to refer to groups such as the Branch Davidians as “cults” and to engage in stereotyping does not end with the demise of the group. As seen with the Jonestown tragedy, terms with negative connotations, negative perceptions, and negative treatment of group members persist long after the group’s demise and even after the deaths of its members. The phrase “drinking the Kool-Aid” is still used in many different contexts, and it seems that, at this point, many people may not even be aware of its origin (Moore, 2003).

According to Moore (2003), using this phrase to convey unquestioning obedience in a variety of situations, with no consideration for the original tragedy with which many are still coping today, systematically dismisses the survivors’ suffering. The experience of the cult members has been ignored and reduced to a phrase that transforms “tragedy to comedy” because it is “too terrifying to face directly” (Moore, 2003, 99). Olson (2006) conducted a study in which the public’s perceptions of cults were measured and compared with perceptions of New Religious Movements (NRMs) and of new Christian churches. Changing the term “cult” to NRM led to more positive responses, but responses to questions regarding new Christian churches were more positive than responses regarding either cults or NRMs.

While research shows that people hold negative views toward cult members and may try to put distance between themselves and people who have been involved in cults, there do not seem to be any significant differences between “us” and “them.” The majority of cult members show no signs of mental health issues and come from “normal” families, and the remainder likely show signs of only “mild” psychological issues (Rhoads, 1997, para. 9). The desire to find differences between ourselves and the people who have been involved with, or have fallen victim to, the tragedies associated with cults may be an effective coping mechanism and a way of simplifying incoming information, but it may ultimately be dehumanizing and dangerous. Levine (2003) explained the better-than-average illusion, in which individuals tend to assume that they are less naive, less gullible, less conforming, more aware, and more knowledgeable than other people, and that they possess above-average critical thinking skills. Tim Carter (2003), who had been a member of Peoples Temple and experienced these negative effects first-hand, stated, “Many people maintain ‘they’ could never fall prey to a ‘cult’ or ‘mind control’ techniques, could never allow their beliefs to be manipulated, and could never surrender their independence” (para. 16). Many Jonestown survivors reveal through interviews, lectures, writing, or art that they share that same sentiment, suffering in their own way not only from the losses they endured in 1978 but also from the judgment and stigma that still surround them today. The negative stereotypes and assumptions many people hold regarding cult members may therefore be inaccurate, and analyzing cult membership in terms of typical human needs and motivations may reduce the prevalence of these negative ideas.

Affiliation Motivation and the Need to Belong

One important human motivation is affiliation motivation, which is associated with the need to have social relationships (Schachter, 1959). Research suggests that these relationships may provide us with comfort and support. Schachter (1959) conducted a study in which he led participants to believe that they would be undergoing electric shocks and then gave them the choice to wait in a room with others or alone. When participants were told that they would be subjected to a painful shock, they were more likely to choose to wait with other people, and even more likely to do so when they were under the impression that those other people would be enduring the same painful shocks. These results support the theorized importance of the human motivation to affiliate, especially when individuals are put in anxiety-inducing situations. Affiliation motivation has been studied in relation to an individual’s religious views as well (Van Cappellen, Fredrickson, Saroglou, & Corneille, 2017). The researchers hypothesized that individuals who consider themselves religious will have a stronger need or desire to form social relationships, since religions are based on creating a community of individuals with shared beliefs. When individuals were instructed to sit in a waiting room containing a chair with another person’s belongings on it, those who identified themselves as religious were more likely to sit closer to the chair. This behavior seems to be indicative of higher affiliation motivation, supporting the researchers’ hypothesis.

Many cults are religious in nature, and, just as some individuals may be motivated to join mainstream religious groups in order to establish and maintain social relationships, cult members may have also been influenced to become involved in their groups by affiliation motivation.

When considering affiliation, it may be helpful to discuss attachment as well. Attachment is an emotional bond between people that connects them across time and space (Bowlby, 1969). Bowlby (1982) studied attachment styles in children, suggesting that the need for attachment persists through adulthood. A child’s early experiences with attachment can be important enough to shape their future relationships, demonstrating that an individual’s relationships may be central to their identity. This apparent need for attachment early in life was once thought to be driven solely by the need for nutrition, but studies with infant monkeys showed that nutrition may not be the only important factor (Harlow, 1959). When presented with two artificial mothers, one made of wire and wood and the other covered in cloth, the monkeys consistently chose to spend more time clinging to the cloth mother rather than the wire mother, regardless of which one was equipped with a nursing bottle. This suggests that seeking attachments that provide comfort may be a valuable and fundamental aspect of human nature.

Part of the need to form social relationships is the need to belong. According to Baumeister and Leary (1995), people are likely to seek out groups that make them feel like they belong and are accepted. Individuals readily form relationships with others with whom they have something in common or whom they encounter frequently, and once these relationships are formed, individuals are sensitive to losing them. People who have been deprived of social relationships also seem to be more at risk for psychological and physical health issues (Baumeister & Leary, 1995). For some, satisfying the need to belong may mean becoming part of an alternative group, like a cult, if they feel they do not belong in any mainstream group. Jim Jones, the leader of Peoples Temple, found popularity because of his inclusion of both white and African American people, providing a safe haven where people of any color could mingle and practice their religion (Layton, 1998). Charles Watson (1978), who had been a member of the Manson Family and participated in the Tate-LaBianca murders, recounts the first time he met Charles Manson:

 [T]he first thing I felt was a sort of gentleness, an embracing kind of acceptance and love … Here I was, accepted in a world I’d never even dreamed about, mellow and at my ease. Charlie murmured in the background, something about love, finding love, letting yourself love. I suddenly realized that this was what I was looking for: love. (27)

The affiliation motivation and need to belong that contribute to an individual’s involvement with a religious group, community group, or any other group may be the same motivations that account for an individual’s involvement with a cult. Gebauer and Maio (2012) suggest that individuals may also be motivated by the need to belong to believe in God, a figure that can symbolize acceptance. While not all cults involve belief in a particular God, cults may still espouse beliefs in some sort of higher power or divine figure. Studies consistently demonstrate the influence that affiliation motivation and the need to belong can have and the way they can push an individual to seek out and join various groups. It is also important to consider that, as Zimbardo (1997) points out, “No one ever joins a ‘cult.’ People join interesting groups that promise to fulfill their pressing needs. They become ‘cults’ when they are seen as deceptive, defective, dangerous, or as opposing basic values of their society” (14).

Processes of Indoctrination, Persuasion, and Cognitive Dissonance

The human drive to form social relationships and the need to belong may contribute to an individual’s decision to seek out groups, including groups that encourage alternative lifestyles or groups that would be considered cults. Once the individual comes into contact with a cult, they are likely to be subjected to a process of indoctrination. Baron (2000) suggests that this process was responsible for the maintenance of groups such as Peoples Temple, the Branch Davidians, and Heaven’s Gate. He describes four stages of the process, beginning with the softening-up stage. During this stage, the individual is isolated from their previous life and social support system and is placed into conditions that cause stress and disorientation. Galanter (1989) stated that Jim Jones forced his followers to cut off communication with anyone outside of Jonestown and prevented them from being exposed to outside sources of information, and he explained that individuals may be more susceptible to social influence when an unfamiliar situation makes them feel or think differently than usual; these changes can interfere with existing beliefs and lead to the acceptance of new explanations and attitudes. Charles Manson persuaded his followers to move to Spahn Ranch, giving up their personal possessions and leaving their families behind. He also often encouraged drug use, which was disorienting and altered their state of mind (Watson, 1978). Drugs can influence perception, cognition, and affect, and their physiological effects can impact mental states and social interactions (Galanter, 1989). Jim Jones was also known for providing his followers in Jonestown with a diet that kept them too weak to fight back against him (Zimbardo, 2005). Next, the individual is encouraged to engage in the cult’s norms and behaviors, which Baron (2000) calls the compliance stage. The first two stages seem to reflect the disrupt-then-reframe technique, which involves a small disruption, perhaps created by unexpected phrasing, followed by a direct reframing of the object as desirable; the technique takes advantage of the effects of distraction on persuasion (Davis & Knowles, 1999). Salespeople who used this technique were more effective, and the same mechanisms may come into play within cults, where leaders may introduce distracting conditions and then present the group’s characteristics in a positive light. Once the individual begins participating in the group’s behavioral conventions, they move into the internalization stage and start to assimilate to the new culture. Finally, in the consolidation stage, membership in the group becomes a part of the individual’s identity, and members may make sacrifices that demonstrate their commitment, as well as deny or rationalize the negative aspects of the group (Baron, 2000).

This process of exerting social control and convincing individuals to join a cult can be made more effective by characteristics of the group’s leader: his physical appearance, his presentation of ideas, and his use of basic persuasive techniques. A communicator’s appearance is important to consider when it comes to persuasion because it is easily observed by the target audience, and research exploring the impact of a person’s attractiveness has demonstrated the privileges afforded to those deemed more attractive (Patzer, 1985). In addition to these privileges related to friendships, relationships, and job opportunities, higher physical attractiveness has been shown to make an individual more effective at persuasion (Chaiken, 1979). Praxmarer (2011) maintains that, regardless of the particular situation, persuasion is more successful when its source is attractive. Attractiveness may seem more relevant to advertising a beauty product than to deciding whether to join a cult, but an individual may still be more likely to respond favorably when the group’s leader is attractive. Researchers suggest this increased compliance and preference for people who are considered attractive may be an evolutionary adaptation; people may have come to respond favorably to attractive people because, throughout evolution, doing so benefitted their chances of survival, so the genes and the inclination to exhibit this behavior have been passed down to later generations (Little, Jones, & DeBruine, 2011). This theory is supported by a study with six-month-old infants, who consistently showed preferences for more attractive faces; this early emergence of a preference for attractiveness may act as the foundation of later behavior and of more favorable treatment of attractive individuals (Ramsey, Langlois, Hoss, Rubenstein, & Griffin, 2004). While there seems to be no definitive study suggesting that cult leaders tend to be attractive, it is possible that some cult members found their leaders attractive and that this played a role in their being persuaded.

As part of the process of indoctrination seen in cults, leaders often isolate new members from their family and friends outside of the group. This isolation likely prevents members from being exposed to negative opinions about the group or ideas that contradict the group’s beliefs. When individuals are repeatedly exposed to a certain idea, even when that idea comes from only one source, they are more likely to assume that the idea is credible and widely accepted (Weaver, Garcia, Schwarz, & Miller, 2007). Cult leaders may bombard members with their beliefs, giving lectures or sermons or engaging in constant conversation. Jim Jones went so far as to install loudspeakers around the grounds of Jonestown so that his persuasive communication with his followers was unrelenting (Baumeister & Bushman, 2016). The members of Peoples Temple in Jonestown may have been led to believe that everyone in the group believed in Jones’ ideas and that his ideas were the only credible ones, since they heard them so often. The frequency of these broadcasts may have been influential, but the fact that Jones chose loudspeakers to communicate with his followers affected his persuasiveness as well. A study by Cantril and Allport (1935) shows that when a message is conveyed over a loudspeaker, “the listener is on the whole less analytical, less alert, less involved personally and socially, and more passively receptive than he is in the face-to-face situation” (157).

While these features of the speaker and the way he presents his ideas contribute to his ability to persuade his followers, cult leaders may also implement some basic, everyday techniques. One of these is the foot-in-the-door technique: when someone complies with a small request, they are more likely to comply with a subsequent larger request made by the same person (Freedman & Fraser, 1966). As explained, cult leaders typically require members to sacrifice many aspects of their lives upon entering the group. These sacrifices represent the leader’s first requests. Later, when leaders ask members to make larger sacrifices, the members may be more inclined to comply. Jim Jones, for example, made it clear that he did not expect much from new members, but after some time he would ask his followers to attend lengthy services, sign over their personal property and life insurance, live in a collective community, and even write false statements that Jones could later use as blackmail if he ever felt it necessary (Levine, 2003). Another relevant factor is the norm of reciprocity: when someone offers help to another, the individual who received the favor may feel compelled to return it in the future (Regan, 1971). Gouldner (1960) suggests that this reciprocity is essential for the development and maintenance of social relationships. He also explains the concept of exploitation, which occurs when one side of the relationship provides more than the other, as when members sacrifice much of their lives, possessions, and identities only to receive, in return, the ability to say they are a member of the group. When an individual received a soda from a confederate in one study, that person was then more likely to comply when the confederate asked them to buy raffle tickets (Regan, 1971). The act of offering a soda seems to have made the recipient feel obligated to give something back. Mike Cartmell, who belonged to Peoples Temple prior to the tragedy in Jonestown, explains, “He [Jim Jones] gave you your five minutes, and in return, you gave him your life” (qtd. in Zimbardo, 2005, para. 83).

During the process of indoctrination and the consolidation stage, cult members who stay with the group may attempt to rationalize their involvement. This rationalization is part of the concept of cognitive dissonance. The theory suggests that when individuals recognize that their cognitions are inconsistent with one another, or that their behavior is inconsistent with their cognitions, they are put in an uncomfortable state and are motivated to reduce the conflict by changing one of the components creating it (Festinger, 1957). According to Festinger and Carlsmith (1959), when individuals are encouraged to express an opinion with which they do not personally agree, they will modify their own beliefs to be more aligned with the expressed opinion. In their study, participants completed dull tasks but were paid either one dollar or twenty dollars to tell a waiting participant that the tasks were actually enjoyable. Since the participants’ actions were not consonant with their thoughts, they experienced cognitive dissonance and attempted to resolve the conflict by rating the tasks favorably during the interview that followed. This conflict and attempted resolution were especially prominent when participants were paid only one dollar: not only did they have to endure monotonous tasks, they were not well rewarded for their compliance. Individuals seem to be more inclined to change their beliefs when the rewards received for such behavior are insufficient. Festinger (1961) explains:

It seems clear that the inclination to engage in behavior after extrinsic rewards are removed is not so much a function of past rewards themselves. Rather, and paradoxically, such persistence in behavior is increased by a history of nonrewards or inadequate rewards. I sometimes like to summarize all this by saying that rats and people come to love things for which they have suffered. (11)

Once members have dedicated themselves to the group and have observed its negative aspects, they may justify the time and energy they have spent and the sacrifices they have made in order to be a member. They will try to explain their commitment by telling themselves that it must be worth it and will, therefore, become even more devoted to the cult. Cults seem to be a potential source of cognitive dissonance, as they may introduce new beliefs to their members, but this perpetuates membership as individuals rely on and become more devoted to the group in order to reduce the dissonance. Members may even recruit others to join the group and try to convince them that the group’s beliefs are good and valid; if these recruitment efforts are successful, they are able to further rationalize and convince themselves that the beliefs are good and valid (Festinger, 1957). Jim Jones actually forced his followers to express gratitude toward him and the group, replicating, in real life, Festinger and Carlsmith’s conditions (Zimbardo, 2005). It is possible that the act of declaring their happiness and love of Jonestown, when the members may not have all truly felt this way, had the same effect, strengthening their dedication to the group even as they were subjected to deeply unpleasant conditions.

Ingroup Bias/Outgroup Derogation and Social Identity Benefits

The concept of cognitive dissonance helps account for cult members’ continued involvement with the group, but the tendency to engage in ingroup bias and the ways an individual can benefit from their social identity can also explain why people remain in cults, or in any other group for that matter. Ingroup bias is an individual’s preference for, and more favorable treatment of, people who are in their own group (Turner, Brown, & Tajfel, 1979). Deikman (1990) explained that cult members may feel threatened by outsiders and react by holding onto the belief that their group is special or superior, which results in the devaluing of individuals and beliefs outside the group. Just as the general public may not consider the motivations, precise mechanisms, and individual differences that exist within cults, cult members, once dedicated to the group, may not attempt to understand the views of outsiders and may instead assume that outsiders share all the same negative traits (Deikman, 1990).

Van Cappellen, Fredrickson, Saroglou, and Corneille (2017) tested the relationship between religiosity (strong religious beliefs and identities) and bias in social interactions. Their studies found that individuals with strong religious associations tended to demonstrate ingroup bias. Participants in one study were told that an occupied chair in the waiting room belonged to either a Christian or an atheist, and participants who were themselves Christian were more likely to sit closer to the chair when they were led to believe its occupant was Christian. In another study, participants played a Cyberball game (an online ball-tossing game) with a Christian player, an atheist player, and a player of unknown religious affiliation; Christian participants again showed ingroup bias by favoring the Christian player. Sherif (1956) demonstrated that the tendency to exhibit ingroup bias persists even when the groups we associate with are arbitrary. In his study, young boys were assigned to different groups at a summer camp. The two teams were initially kept separate, and the boys in each group immediately began to form friendships and a group hierarchy and came to a consensus on a group name. Once it was clear that the boys were identifying with the other members of their groups, the two groups were introduced to each other. Not only did they show very strong ingroup bias, they also showed the accompanying outgroup derogation, treating members of the other group badly; the boys became competitive and aggressive, calling each other names and getting into physical fights. While the boys in this study had been randomly assigned, they had time to spend together and form bonds; other studies find evidence for a strong tendency toward ingroup bias even when the commonalities between individuals are far more trivial. People show a preference for others who share their birthday, cooperating more with them and making decisions that favor them more than they would for someone with a different birthday (Miller, Downs, & Prentice, 1998).

These concepts of ingroup bias and outgroup derogation were studied more extensively by Turner, Brown, and Tajfel (1979), who tested the hypothesis that, when given responsibility for determining how much money each individual should receive for participating in a study, participants would make decisions that favored their own group and disadvantaged the other groups. They found that participants would sacrifice earnings for themselves and their groups for the sake of creating larger disparities between their own group and the others. Ingroup bias can be observed in young children as well, as they may associate familiarity with goodness. While this ingroup bias does not necessarily mean the children will also exhibit outgroup derogation, it is possible that it sets the stage for the child to be influenced by society later on and to develop prejudiced attitudes (Cameron, Alvarez, Ruble, & Fuligni, 2001). As Brewer (2007) observed, conflict is not a requirement for producing ingroup bias, and the presence of ingroup bias does not always mean the individual will treat the outgroup poorly. Evolutionary psychologists hypothesize that the tendency to create social categories and engage in ingroup bias stems from our ancestors’ tendency to interact with people who were similar and familiar to them, which may have ensured their safety and increased their chances of survival (Brewer, 2007). Once an individual undergoes the process of indoctrination present in a cult and becomes invested in the group, they may engage in ingroup bias, just as members of any other group are likely to do.

The tendency to favor the group to which one belongs can also be explained, in part, by social identity theory. Once an individual becomes a member of a group, they are likely to incorporate that membership into their self-concept, considering their involvement part of their identity (Turner, 1982). As explained, the desire to form social relationships is a powerful motivator, so being able to treat those relationships as part of who we are seems important to us as social beings. Individuals may seek out favorable evaluations of themselves from others in order to maintain positive self-esteem, but social identity theory describes how individuals may also express positive appraisals of the groups with which they are involved for the same purpose (Tajfel, 1982). Since the groups one associates with can be very important and can become part of one’s identity, feeling good about those groups may help people feel better about themselves. Attempts to reduce cognitive dissonance may also contribute to this process: individuals who have experienced negative aspects of a group they have become dedicated to may try to justify their membership, looking for positive aspects of the group and convincing themselves that it is superior in some way. In addition to forming positive evaluations of their groups, individuals may also come to see their groups as distinct from other groups and their members (Taylor & Moghaddam, 1994). Social identity theory holds that an individual’s social identity is one component of their whole identity and that individuals are motivated to achieve and maintain positive social identities; this can be accomplished by establishing distinctions between ingroups and outgroups and by evaluating one’s own groups more favorably than others (Tajfel & Turner, 2004). This reinforces the tendency to favor members of one’s own group and the potential to treat members of other groups poorly.

Conclusion

Contrary to popular belief, the negative assumptions and stereotypes that exist in regards to cult members do not tell an accurate story of their lives and involvement with cults. While looking for dispositional flaws or abnormal motivations may be easier and provide a more satisfying or comforting answer, evidence suggests that cult members are no different from any other person that belongs to any other group. Instead of continuing to blame cult members, who are often victims in many ways, it may be much more helpful to consider the situational factors and typical motivations, needs, and behaviors that contribute to cult membership.

As research has shown, human behavior reflects not only the personality of individuals but also the context in which the behavior occurs. It is also important to acknowledge that cults do not typically receive that label until after the fact, and that the people who joined these groups did not do so with the idea that they would eventually be considered cults. Driven by affiliation motivation and the need to belong, typical individuals seek out groups to satisfy the human need for social relationships and acceptance. Sometimes people turn to small community clubs or large religious organizations, and sometimes they find themselves associating with a group that is later deemed a cult. In all of these cases, the individuals became affiliated with a social system they believed to be good. Following one’s entry into any group, a process of indoctrination usually occurs that solidifies the person’s commitment to the group. Cult leaders exercise social control over their followers, and many factors can make their persuasion more effective, such as the leader’s attractiveness, the leader’s use of repetitive communication, the use of the foot-in-the-door technique, and the exploitation of the norm of reciprocity.

Cognitive dissonance may also play a role in one’s decision to remain in a cult, even after experiencing the negative aspects cults are known for. Individuals will rationalize their cult involvement by convincing themselves the group is good and their membership is worth the hardships; it seems counterintuitive, but suffering can actually make an individual’s allegiance to the group even stronger. This dedication tends to manifest as ingroup bias, and individuals will favor others who are also members of their group. Not only does this perpetuate their membership, as they lend more credibility to fellow members and the beliefs of the group while shunning outside opinions, it is also conducive to social identity benefits.

Social identity theory maintains that individuals tend to express positive evaluations of the groups they are associated with, which translate to positive evaluations of themselves since they are members of those groups.

All of these factors may have contributed to the establishment and maintenance of many cults, including the Branch Davidians, the Manson Family, and Peoples Temple. Research of this kind aims to educate the general public and encourage a greater understanding of cults. This knowledge may help individuals recognize the warning signs of a harmful cult and avoid meeting the same fate as the members of the groups discussed here. Additionally, acknowledging the many factors that influence cult membership allows for greater respect and sympathy for the victims of cult tragedies. The mass suicide/murder that occurred at Jonestown, for example, which took the lives of over nine hundred people and changed the lives of many more, cannot and should not be reduced to a popular phrase that gets casually thrown around; it was much more and far beyond “drinking the Kool-Aid.”

References

Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgment. In H. Guetzkow (Ed.) Groups, leadership and men (177-190). Pittsburgh, PA: Carnegie Press.

Ashmore, R. D., & Del Boca, F. K. (1981). Conceptual approaches to stereotypes and stereotyping. In D. Hamilton (Ed.), Cognitive processes in stereotyping and intergroup behavior (1-35). London: Psychology Press.

Ayella, M. (1990). “They must be crazy”: Some of the difficulties in researching “cults”.  American Behavioral Scientist, 33, 562-577.

Baron, R. S. (2000). Arousal, capacity, and intense indoctrination. Personality & Social Psychology Review, 4, 238–254.

Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117, 497–529.

Baumeister, R. F., & Bushman, B. J. (2016). Social psychology and human nature, comprehensive edition (4th ed.). Australia: Cengage.

Beckford, J. A. (1985). Cult controversies. London: Tavistock Publications.

Bowlby, J. (1969). Attachment and loss. New York: Basic Books.

Bowlby, J. (1982). Attachment and loss (2nd ed.). New York, NY: Basic Books.

Brewer, M. B. (2007). The importance of being we: human nature and intergroup relations. The American Psychologist, 62, 728-738.

Bromley, D. G. (1993) The mythology of cults. In J. R. Lewis (Ed.), From the Ashes: Making sense of Waco (121-124). Lanham, MD: Rowman & Littlefield Publishers.

Cahill, T. (1979, January 25). In the valley of the shadow of death: Guyana after the Jonestown massacre. Rolling Stone.

Cameron, J. A., Alvarez, J. M., Ruble, D. N., & Fuligni, A. J. (2001). Children’s lay theories about ingroups and outgroups: Reconceptualizing research on prejudice. Personality & Social Psychology Review, 5, 118–128.

Cantril, H., & Allport, G. W. (1935). The psychology of radio. New York: Harper & Brothers.

Carter, T. (2003). The big grey.

Chaiken, S. (1979). Communicator physical attractiveness and persuasion. Journal of Personality and Social Psychology, 37, 1387-1397.

Davis, B., & Knowles, E. S. (1999). A disrupt-then-reframe technique of social influence. Journal of Personality and Social Psychology, 76, 192-199.

Deikman, A. J. (1990). The wrong way home. Boston, MA: Beacon Press.

Deikman, A. J. (2003). Them and us: Cult thinking and the terrorist threat. Berkeley, CA: Bay Tree Publishing.

Fernández Villanueva, C. F., & Revilla Castro, J. C. (2016). “Distant” beings or “human” beings: Real images of violence and strategies of implication or distancing from victims. Communication & Society, 29, 103–118.

Festinger, L. (1957). A theory of cognitive dissonance. Stanford, CA: Stanford University Press.

Festinger, L. (1961). The psychological effects of insufficient rewards. American Psychologist, 16, 1–11.

Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal & Social Psychology, 58, 203-210.

Freedman, J. L., & Fraser, S. C. (1966). Compliance without pressure: The foot-in-the-door technique. Journal of Personality & Social Psychology, 4, 195–202.

Galanter, M. (1989). Cults: Faith, healing, and coercion. New York, NY: Oxford University Press.

Gebauer, J. E., & Maio, G. R. (2012). The need to belong can motivate belief in God. Journal of Personality, 80, 465-501.

Gouldner, A. W. (1960). The norm of reciprocity: A preliminary statement. American Sociological Review, 25, 161–178.

Harlow, H. F. (1959). Love in infant monkeys. Scientific American, 200, 68-75.

Hicks, R. (1993). Cult label made Waco violence inevitable. In J. R. Lewis (Ed.), From the ashes: Making sense of Waco (63-65). Lanham, MD: Rowman & Littlefield Publishers.

Hinds, M. D. (1993, April 20). Death in Waco: The lost cause; Texas cult membership: Many lives, shared fate. The New York Times, 20. Retrieved from https://www.nytimes.com

Ichheiser, G. (1970). Appearances and realities. San Francisco, CA: Jossey-Bass Inc., Publishers.

Kilgo, D. K., Harlow, S., García-Perdomo, V., & Salaverría, R. (2018). A new sensation? An international exploration of sensationalism and social media recommendations in online news publications. Journalism, 19, 1497-1516.

Layton, D. (1998). Seductive poison. New York: Anchor Books, Doubleday.

Levine, R. (2003). The power of persuasion: How we’re bought and sold. Hoboken, NJ: John Wiley & Sons, Inc.

Little, A. C., Jones, B. C., & DeBruine, L. M. (2011). Facial attractiveness: Evolutionary based research. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 366, 1638-59.

Lynch, J. (2018). Nielsen estimates that 119.9 million U.S homes have TVs for the upcoming season. Retrieved from https://www.adweek.com/convergent-tv/nielsen-estimates-that-119-9-million-u-s-homes-have-tvs-for-the-upcoming-season/.

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371-378.

Milgram, S. (2004). Obedience to authority: An experimental view. New York: Perennial Classics.

Miller, D. T., Downs, J. S., & Prentice, D. A. (1998). Minimal conditions for the creation of a unit relationship: The social bond between Birthday-Mates. European Journal of Social Psychology, 28, 475–481.

Moore, R. (2003). Drinking the Kool-Aid: The cultural transformation of a tragedy. Nova Religio, 7, 92-100.

Mutz, D. C., & Nir, L. (2010). Not necessarily the news: Does fictional television influence real-world policy preferences? Mass Communication and Society, 13, 196-217.

Neal, L. S. (2011). “They’re freaks!”: The cult stereotype in fictional television shows, 1958-2008. Nova Religio: The Journal of Alternative & Emergent Religion, 14, 81–107.

Olson, J. (2006). The public perception of “cults” and “new religious movements.” Journal for the Scientific Study of Religion, 45, 97–106.

Olsson, A. (2013). “Normal” compared to abnormal leaders and groups. The Journal of Psychohistory, 41, 39-43.

Patzer, G. L. (1985). The physical attractiveness phenomena. New York: Plenum Press.

Praxmarer, S. (2011). How a presenter’s perceived attractiveness affects persuasion for attractiveness-unrelated products. International Journal of Advertising, 30, 839–865.

Rabinovitz, J. (1997, March 30). Family is confident that son’s life and suicide were of ‘his own volition’. The New York Times. Retrieved from https://www.nytimes.com

Ramsey, J. L., Langlois, J. H., Hoss, R. A., Rubenstein, A. J., & Griffin, A. M. (2004). Origins of a stereotype: Categorization of facial attractiveness by 6-month-old infants. Developmental Science, 7, 201-211.

Regan, D. T. (1971). Effects of a favor and liking on compliance. Journal of Experimental Social Psychology, 7, 627-639.

Rhoads, K. (1997). Working psychology. Retrieved from http://www.workingpsychology.com/

Riggio, H. R., & Garcia, A. L. (2009). The power of situations: Jonestown and the fundamental attribution error. Teaching of Psychology, 36, 108–112.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. Advances in Experimental Social Psychology, 10, 173–220.

Schachter, S. (1959). The psychology of affiliation. Stanford, CA: Stanford University Press.

Sherif, M. (1956). Experiments in group conflict. Scientific American, 195, 32–58.

Tabor, J. D., & Gallagher, E. V. (1995). Why Waco? Cults and the battle for religious freedom in America. Berkeley, CA: University of California Press.

Tajfel, H. (1982). Social psychology of intergroup relations. Annual Review of Psychology, 33, 1-39.

Tajfel, H., & Turner, J. C. (2004). The social identity theory of intergroup behavior. In J. T. Jost & J. Sidanius (Eds.), Political psychology: Key readings (276-293). New York, NY, US: Psychology Press.

Taylor, D. M., & Moghaddam, F. M. (1994). Theories of intergroup relations (2nd ed.). Westport, CT: Praeger Publishers.

Turner, J. C., Brown, R. J., & Tajfel, H. (1979). Social comparison and group interest in ingroup favoritism. European Journal of Social Psychology, 9, 187–204.

Turner, J. C. (1982). Towards a cognitive redefinition of the social group. In H. Tajfel (Ed.), Social identity and intergroup relations (15-40). Cambridge: Cambridge University Press.

Van Cappellen, P., Fredrickson, B. L., Saroglou, V., & Corneille, O. (2017). Religiosity and the motivation for social affiliation. Personality and Individual Differences, 113, 24–31.

Watson, C. (1978). Will you die for me? (n.p.)

Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice can sound like a chorus. Journal of Personality and Social Psychology, 92, 821–833.

Zimbardo, P. (1997). What messages are behind today’s cults? APA Monitor, 28, 14.

Zimbardo, P. (2005). State terror and state-sanctioned terrorism: Models of mind and behavior control in Orwell’s 1984, as operationalized by Jim Jones in the Peoples Temple mass suicide/murders.