Project MKUltra, or MK-Ultra, was a covert human research program into behavioural modification run by the CIA's Office of Scientific Intelligence. The program began in the early 1950s, was officially sanctioned in 1953, was reduced in scope in 1964, further curtailed in 1967, and finally halted in 1973.[1] It controversially used unwitting U.S. and Canadian citizens as its test subjects.[2][3][4][5] MKUltra used numerous methods to manipulate people's mental states and alter brain functions, including the surreptitious administration of drugs (especially LSD) and other chemicals, hypnosis,[6] sensory deprivation, isolation, verbal and sexual abuse, and various forms of torture.
The research was undertaken at 80 institutions, including 44 colleges and universities, as well as hospitals, prisons, and pharmaceutical companies.[7] The CIA operated through these institutions using front organizations, although top officials at some institutions were aware of the CIA's involvement.[8] MKUltra was allocated 6 per cent of total CIA funds.[9]
Project MKUltra was first brought to wide public attention in 1975 by the U.S. Congress, through investigations by the Church Committee, and by a presidential commission known as the Rockefeller Commission. Investigative efforts were hampered by the fact that CIA Director Richard Helms ordered all MKUltra files destroyed in 1973; the Church Committee and Rockefeller Commission investigations relied on the sworn testimony of direct participants and on the relatively small number of documents that survived Helms' destruction order.[10]
In 1977, a Freedom of Information Act request uncovered a cache of 20,000 documents[11] relating to Project MKUltra, which led to Senate hearings later that same year.[3] In July 2001, most surviving information regarding MKUltra was officially declassified.
The Stanford prison experiment was a study of the psychological effects of becoming a prisoner or prison guard. The experiment was conducted at Stanford University from August 14 to August 20, 1971, by a team of researchers led by psychology professor Philip Zimbardo.[1] It was funded by the US Office of Naval Research[2] and was of interest to both the US Navy and Marine Corps as an investigation into the causes of conflict between military guards and prisoners.
Twenty-four male students out of 75 were selected to take on randomly assigned roles of prisoners and guards in a mock prison situated in the basement of the Stanford psychology building. The participants adapted to their roles well beyond Zimbardo's expectations, as the guards enforced authoritarian measures and ultimately subjected some of the prisoners to psychological torture. Many of the prisoners passively accepted psychological abuse and, at the request of the guards, readily harassed other prisoners who attempted to prevent it. The experiment even affected Zimbardo himself, who, in his role as the superintendent, permitted the abuse to continue. Two of the prisoners quit the experiment early and the entire experiment was abruptly stopped after only six days. Certain portions of the experiment were filmed and excerpts of footage are publicly available.
When the Abu Ghraib military prisoner torture and abuse scandal was publicized in March 2004, many observers were immediately struck by its similarities to the Stanford prison experiment. Chief among them was Zimbardo himself, who paid close attention to the details of the story. He was dismayed by official military and government representatives' shifting the blame for the torture and abuse in the Abu Ghraib American military prison onto "a few bad apples" rather than acknowledging possibly systemic problems in a formally established military incarceration system.
Eventually, Zimbardo became involved with the defense team of lawyers representing one of the Abu Ghraib prison guards, Staff Sergeant Ivan "Chip" Frederick. He was granted full access to all investigation and background reports, and testified as an expert witness in SSGT Frederick's court martial, which resulted in an eight-year prison sentence for Frederick in October 2004.
Zimbardo drew from his participation in the Frederick case to write the book The Lucifer Effect: Understanding How Good People Turn Evil, published by Random House in 2007, which deals with the striking similarities between his own Stanford Prison Experiment and the Abu Ghraib abuses.
Authority (from the Latin auctoritas) is a right conferred by recognized social position. Authority often refers to power vested in an individual or organization by the state. It can also refer to recognized expertise in an area of academic knowledge. An Authority (capitalized) is a governing body in which certain authority (lowercase) is vested; for example, the Puerto Rico Electric Power Authority.
The Milgram experiment on obedience to authority figures was a series of notable social psychology experiments conducted by Yale University psychologist Stanley Milgram, which measured the willingness of study participants to obey an authority figure who instructed them to perform acts that conflicted with their personal conscience. Milgram first described his research in 1963 in an article published in the Journal of Abnormal and Social Psychology,[1] and later discussed his findings in greater depth in his 1974 book, Obedience to Authority: An Experimental View.[2]
The experiments began in July 1961, three months after the start of the trial of German Nazi war criminal Adolf Eichmann in Jerusalem. Milgram devised his psychological study to answer the question: "Was it that Eichmann and his accomplices in the Holocaust had mutual intent, at least with regard to the goals of the Holocaust?" In other words, "Was there a mutual sense of morality among those involved?" Milgram's testing suggested that the millions of accomplices could have been merely following orders, despite violating their deepest moral beliefs. The experiments have been repeated many times, with consistent results within societies but different percentages across the globe.[3] The experiments were also controversial, and considered by some scientists to be unethical or physically or psychologically abusive, prompting more thorough review by boards and committees overseeing work with human subjects.
Robert B. Cialdini is Regents’ Professor Emeritus of Psychology and Marketing at Arizona State University.
He is best known for his popular book on persuasion and marketing, Influence: The Psychology of Persuasion, which has sold over 2 million copies, has been translated into twenty-six languages, and has been listed on the New York Times Business Best Seller list. Fortune magazine includes Influence among its "75 Smartest Business Books."
Six key principles of persuasion by Robert Cialdini
- Reciprocity - People tend to return a favor, thus the pervasiveness of free samples in marketing. In his conferences, he often uses the example of Ethiopia providing thousands of dollars in humanitarian aid to Mexico just after the 1985 earthquake, despite Ethiopia suffering from a crippling famine and civil war at the time. Ethiopia had been reciprocating for the diplomatic support Mexico provided when Italy invaded Ethiopia in 1935. The good cop/bad cop strategy is also based on this principle.
- Commitment and Consistency - If people commit, orally or in writing, to an idea or goal, they are more likely to honor that commitment because they have established that idea or goal as congruent with their self-image. Even if the original incentive or motivation is removed after they have already agreed, they will continue to honor the agreement. For example, in car sales, suddenly raising the price at the last moment works because the buyer has already decided to buy. Cialdini notes the Chinese brainwashing of American prisoners of war, which aimed to rewrite their self-image and gain automatic, unenforced compliance. See cognitive dissonance.
- Social Proof - People will do things that they see other people doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up to see what they were seeing. At one point this experiment had to be aborted, as so many people were looking up that they stopped traffic. See conformity, and the Asch conformity experiments.
- Authority - People will tend to obey authority figures, even if they are asked to perform objectionable acts. Cialdini cites incidents such as the Milgram experiments in the early 1960s and the My Lai massacre.
- Liking - People are easily persuaded by other people that they like. Cialdini cites the marketing of Tupperware in what might now be called viral marketing. People were more likely to buy if they liked the person selling it to them. Some of the many biases favoring more attractive people are discussed. See physical attractiveness stereotype.
- Scarcity - Perceived scarcity will generate demand. For example, saying offers are available for a "limited time only" encourages sales.
Conformity is the act of matching attitudes, beliefs, and behaviors to group norms.[1] Norms are implicit rules, shared by a group of individuals, that guide their interactions with others and within society or a social group. This tendency to conform occurs in small groups and/or society as a whole, and may result from subtle unconscious influences or from direct and overt social pressure. Conformity can occur in the presence of others, or when an individual is alone. For example, people tend to follow social norms when eating or watching television, even when alone.
People often conform from a desire for security within a group—typically a group of a similar age, culture, religion, or educational status. This is often referred to as groupthink: a pattern of thought characterized by self-deception, forced manufacture of consent, and conformity to group values and ethics, which ignores realistic appraisal of other courses of action. Unwillingness to conform carries the risk of social rejection. Conformity is often associated with adolescence and youth culture, but strongly affects humans of all ages.[2]
Although peer pressure may manifest negatively, conformity can have good or bad effects depending on the situation. Driving on the correct side of the road could be seen as beneficial conformity.[3] Conformity influences the formation and maintenance of social norms, and helps societies function smoothly and predictably via the self-elimination of behaviors seen as contrary to unwritten rules. In this sense it can be perceived as (though not proven to be) a positive force that prevents acts that are potentially disruptive or dangerous.
As conformity is a group phenomenon, factors such as group size, unanimity, cohesion, status, prior commitment, and public opinion help determine the level of conformity an individual displays.
Conformity is a type of social influence involving a change in belief or behavior in order to fit in with a group.
This change is in response to real (involving the physical presence of others) or imagined (involving the pressure of social norms / expectations) group pressure.
Conformity can also be simply defined as “yielding to group pressures” (Crutchfield, 1955). Group pressure may take different forms, for example bullying, persuasion, teasing, and criticism. Conformity is also known as majority influence (or group pressure).
The term conformity is often used to indicate an agreement to the majority position, brought about either by a desire to ‘fit in’ or be liked (normative) or because of a desire to be correct (informational), or simply to conform to a social role (identification).
There have been many experiments in psychology investigating conformity and group pressure.
Jenness (1932) was the first psychologist to study conformity. His experiment used an ambiguous situation involving a glass bottle filled with beans. He asked participants individually to estimate how many beans the bottle contained, then put the group in a room with the bottle and asked them to provide a group estimate through discussion. Jenness then interviewed the participants individually again and asked whether they would like to change their original estimates or stay with the group's estimate. Almost all changed their individual guesses to be closer to the group estimate.
However, perhaps the most famous conformity study is Solomon Asch's (1951) line judgment experiment.
Types of Conformity
Mann (1969) states that "the essence of conformity is yielding to group pressure". He identified three types of conformity: normative, informational, and ingratiational.
Kelman (1958) distinguished between three different types of conformity: compliance, internalization, and identification.
| Normative Conformity | Informational Conformity |
|---|---|
| Yielding to group pressure because a person wants to fit in and be liked by the group. | Conforming because of a desire to be correct, looking to the group for guidance in an ambiguous situation. |

| Compliance | Internalization |
|---|---|
| Publicly changing behavior to fit in with the group while privately disagreeing. | Publicly changing behavior to fit in with the group and also agreeing with the group privately. |

| Ingratiational Conformity | Identification |
|---|---|
| Conforming to impress or gain favor or acceptance from other people. | Conforming to the expectations of a social role, without necessarily changing private opinion. |
Sherif (1935) Autokinetic Effect Experiment
Aim: Sherif (1935) conducted an experiment with the aim of demonstrating that people conform to group norms when they are put in an ambiguous (i.e. unclear) situation.
Method: Sherif used a lab experiment to study conformity. He used the autokinetic effect – this is where a small spot of light (projected onto a screen) in a dark room will appear to move, even though it is still (i.e. it is a visual illusion).
It was discovered that when participants were tested individually, their estimates of how far the light moved varied considerably (e.g., from 20 cm to 80 cm). The participants were then tested in groups of three. Sherif manipulated the composition of the group by putting together two people whose estimates of the light movement when alone were very similar, and one person whose estimate was very different. Each person in the group had to say aloud how far they thought the light had moved.
Results: Sherif found that over numerous estimates (trials) of the movement of the light, the group converged on a common estimate: the person whose estimate of movement was greatly different from those of the other two in the group conformed to their view.
Sherif said that this showed that people would always tend to conform. Rather than making individual judgments, they tend to come to a group agreement.
Conclusion: The results show that when in an ambiguous situation (such as the autokinetic effect), a person will look to others (who know more / better) for guidance (i.e. adopt the group norm). They want to do the right thing but may lack the appropriate information. Observing others can provide this information. This is known as informational conformity.
References
Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, leadership and men. Pittsburgh, PA: Carnegie Press.
Crutchfield, R. (1955). Conformity and character. American Psychologist, 10, 191-198.
Jenness, A. (1932). The role of discussion in changing opinion regarding a matter of fact. The Journal of Abnormal and Social Psychology, 27, 279-296.
Kelman, H. C. (1958). Compliance, identification, and internalization: Three processes of attitude change. Journal of Conflict Resolution, 2, 51-60.
Mann, L. (1969). Social Psychology. New York: Wiley.
Sherif, M. (1935). A study of some social factors in perception. Archives of Psychology, 27(187).
How to cite this article:
McLeod, S. A. (2007). Conformity in Psychology. Retrieved from http://www.simplypsychology.org/conformity.html
The Asch conformity experiments were a series of laboratory studies published in the 1950s that demonstrated a surprising degree of conformity to a majority opinion. These are also known as the Asch Paradigm.
In a control group, with no pressure to conform to an erroneous view, only one participant out of 35 ever gave an incorrect answer. Solomon Asch hypothesized that the majority of participants would not conform to something obviously wrong; however, when surrounded by individuals all voicing an incorrect answer, participants provided incorrect responses on a high proportion of the questions (32%). Seventy-five percent of the participants gave an incorrect answer to at least one question.
Variations of the basic paradigm tested how many cohorts (confederates) were necessary to induce conformity, examining the influence of as few as one and as many as fifteen. Results indicate that one cohort has virtually no influence and two cohorts have only a small influence. When three or more cohorts are present, the tendency to conform increases only modestly; the maximum effect occurs with four cohorts, and adding additional cohorts does not produce a stronger effect.
The unanimity of the confederates has also been varied. When the confederates are not unanimous in their judgment, even if only one confederate voices a different opinion, participants are much more likely to resist the urge to conform (only 5-10% conform) than when the confederates all agree. This finding illuminates the power that even a small dissenting minority can have. Interestingly, this finding holds whether or not the dissenting confederate gives the correct answer: as long as the dissenting confederate gives an answer that is different from the majority's, participants are more likely to give the correct answer. Men show around half the effect of women (tested in same-sex groups), and conformity is higher among members of an ingroup.
The Asch conformity experiments are often interpreted as evidence for the power of conformity and normative social influence,[2][3] that is, the willingness to conform publicly in order to attain social reward and avoid social punishment. Others have argued that it is rational to use other people's judgments as evidence.[4] Along the lines of the latter perspective, the Asch conformity experiments are cited as evidence for the self-categorization theory account of social influence. From that perspective, the Asch results are interpreted as an outcome of depersonalization processes whereby the participants expect to hold the same opinions as similar others.
The BBC Prison Study explores the social and psychological consequences of putting people in groups of unequal power. It examines when people accept inequality and when they challenge it.
Findings from the study were first broadcast by the BBC in 2002. They have since been published in leading scientific journals and textbooks and have also entered the core student syllabus. They have changed our basic understanding of how groups and power work.