Thursday, May 17, 2018

Social Influence: Crash Course Psychology #38

If someone in a position of authority told
you to like, stop walking on the grass, you would stop walking on the grass, right? And
if they told you to help someone's grandma cross the street, or pick up your dog's poop,
or put your shoes on before you go into a store, you'd probably comply. But what if they ordered you to physically
hurt another person? You're probably thinking "No way! I could never do something like that."
But there's a good chance you're wrong. In the early 1960s, Yale University psychologist
Stanley Milgram began what would become one of social psychology's most famed and chilling
experiments. Milgram began his work during the widely publicized
trial of World War II Nazi war criminal Adolf Eichmann.

Eichmann's defense, like that of other
Nazis, for sending millions of people to their deaths was that he was simply following
the orders of his superiors. And while that may have been true, it didn't fly in court
and Eichmann was ultimately executed for his crimes. But the question got Milgram to thinking,
what might the average person be capable of when under orders? So, for his initial experiment, Milgram recruited
forty male volunteers using newspaper ads. He built a phony "shock generator" with a
scale of thirty switches that could supposedly deliver shocks in increments from 30 volts
up to 450 volts, labeled with terms ranging from "slight shock" to "dangerous shock" to simply "XXX."
He then paired each volunteer participant with someone who was also apparently a participant,
but was in fact one of Milgram's colleagues, posing as a research subject.

He had them
draw straws to see who would be the "learner" and who would be the "teacher." The volunteers
didn't realize that the draw was fixed so that they'd always be the teacher, while Milgram's
buddy would be the learner. So the fake learner was put into a room, strapped to a chair,
and wired up with electrodes. The teacher, the person who was being studied, and a researcher
who was played by an actor, went into another room with a shock generator that the teacher
had no idea was fake. The learner was asked to memorize a list of word
pairs, and the participant was told that he'd be testing the learner's recall of those words
and should administer an electric shock for every wrong answer, increasing the shock level a
little bit each time.

From here, the pretend learner purposely gave mainly wrong answers, eliciting
shocks from the participant. If a participant hesitated, perhaps swayed by the learner's
yelps of pain, the researcher gave orders to make sure he continued. These orders were
delivered in a series of four prods. The first was just "Please continue," and
if the participant didn't comply, the researcher issued other prods until he did.

He'd say
"The experiment requires you to continue" and then "It's absolutely essential that you continue"
and finally "You have no choice but to continue." Even Milgram was surprised by the first round
of experiments. About two-thirds of the participants ended up delivering the maximum 450-volt shock.
All of the volunteers continued to at least 300 volts. Over the years, Milgram kept conducting
this experiment, changing the situation in different ways to see if it had any effect
on people's obedience. What he repeatedly found was that obedience was highest when
the person giving the orders was nearby and was perceived as an authority figure, especially
if they were from a prestigious institution.

This was also true if the victim was depersonalized,
or placed at a distance such as in another room. Plus, subjects were more likely to comply
with the orders if they didn't see anyone else disobeying, if there were no role models
of defiance. In the end, Milgram's path-breaking work sheds
some seriously harsh light on the enormous power of two cornerstone topics of social
psychology: social influence and obedience.

We all conform to some sort of social norms,
like following traffic laws or even obeying the dress codes for different roles and environments.
When we know how to act in a certain group or setting, life just seems to go more smoothly.
Some of this conformity is non-conscious automatic mimicry, like how you're likely to laugh if
you see someone else laughing or nod your head when they're nodding. In this way, group
behavior can be contagious.

But overall, conformity describes how we adjust
our behavior or thinking to follow the behavior or rules of the group we belong to. Social
psychologists have always been curious about the degree to which a person might follow
or rebel against their group's social norms. During the early 1950s, Polish-American psychologist
Solomon Asch demonstrated the power of conformity through a simple test. In this experiment, the volunteer is told
that they're participating in a study on visual perception and is seated at a table with five
other people.

The experimenter shows the group a picture of a standard line and three comparison
lines of various lengths, and then asks the people to say which of the three comparison
lines matches the standard line. It's clear to anyone with any kind of good vision that the second
line is the right answer, but the thing is, most, if not all, of the other people in the
group start choosing the wrong line. The participant doesn't know that those other people are all actors,
a common deception used in social-psychological research, and they're intentionally giving
the wrong answer. This forces the real participant to struggle between trusting their own eyes and
going with the group.

In the end most subjects still gave what they
knew was the correct answer, but more than a third were essentially just willing to give
the wrong answer to mesh with the group. Asch and subsequent researchers found that people
are more likely to conform to a group if they're made to feel incompetent or insecure and are
in a group of three or more people, especially if all those people agree. It also certainly
doesn't hurt if the person admires the group because of, say, its status or its attractiveness,
and if they feel that others are watching their behavior. We also tend to conform more if we're from
a culture that puts particular emphasis on respect for social standards.

This might sound
a little bit familiar, like, all of high school, fraternities or sororities, the big company you work for,
or any other group that you've ever been a part of. The classic experiments of Milgram and Asch
showed us that people conform for lots of different reasons, but they both underscored
the power of situation in conformity - whether that situation elicits respect for authority,
fear of being different, fear of rejection, or simply a desire for approval. This is known
as normative social influence, the idea that we comply in order to satisfy our need to be
liked or belong. But, of course, groups influence our behavior
in more ways than just conformity and obedience.

For example, we may perform better or worse
in front of a group. This is called social facilitation and it's what might, say, help
you sprint the last hundred meters of a race if people are cheering you on, but it's also
what can make you nervous enough to forget the words to that poem you were supposed
to be slamming in front of a crowd. But that's what can happen in front of a group;
what happens when you're actually part of a group? Do you work harder or start slacking?
One study found that if you blindfold students, hand them a rope and tell them to pull as
hard as they can in a game of tug-of-war, the subjects will put in less work if they
think they're part of a team instead of pulling by themselves. About 20% less, it
turns out.

This tendency to exert less effort when you're not individually accountable is
called social loafing. That's pretty good. You can now add the word "loafing" to your
scientific vocabulary. But a group's ability to either arouse or
lessen our feelings of personal responsibility can make us do more dangerous things than
just phone in some group homework assignment.

It can also lead to deindividuation, the loss
of self-awareness and restraint that can occur in group situations. Being part of a crowd
can create a powerful combination of arousal and anonymity; it's part of what fuels riots
and lynch mobs and online trolling. The less individual we feel, the more we're at the
mercy of the experience of our group, whether it's good or bad. And it should come as no surprise that the
attitudes and beliefs we bring to a group grow stronger when we talk with others who
share them.

This is a process psychologists know as group polarization, and it often translates
into a nasty "us" vs "them" dynamic. And you know what is great at polarizing groups?
The internet. The internet has made it easier than ever to connect like-minded people and
magnify their inclinations. This can, of course, breed haters; racists, for example, may become more
racist in the absence of conflicting viewpoints. But it can, and often does, work for good,
promoting education, crowd-sourcing things like fundraising, and organizing people to
fight all kinds of worldsuck.

And group dynamics can not only affect our
personal decisions, they can also influence really big decisions on a larger, even national
scale. Groupthink is a term coined by social psychologist Irving Janis to describe what
happens when a group makes bad decisions because they're too caught up in the unique internal
logic of their group. When a group gets wrapped up in itself and everyone agrees with each
other, no one stops to think about other perspectives. As a result, you get some big and bad ideas,
including some enormous historical fiascoes, like the Watergate cover-up and the Bay of
Pigs invasion and the Chernobyl nuclear reactor accident.

So while two heads may often be
better than one, it's important to make sure those heads are still open to different opinions
or they could do some really dumb stuff. In the end, it's best to understand ourselves
and our decisions as informed simultaneously by both individual and group factors, by personality
and situation. And don't get too freaked out about what people are capable of; I mean,
just think back to Milgram's experiment. For the two-thirds of us who would deliver the maximum
shock under the right circumstances, there's another third who wouldn't, reminding us that while
group behavior is powerful, so is individual choice.

Today you learned about the power of social
influence, conformity, and authority. We looked at the shocking results of the famous Milgram
experiment, the concept of automatic mimicry, and how Solomon Asch demonstrated the power of conformity
in a given situation. You also learned how normative social influence sways us, how social facilitation
can make or break your performance and how social loafing makes people lazy in a group.
And finally, we discussed how harmful deindividuation, group polarization, and groupthink can be. Thank you for watching, especially to all
of our Subbable subscribers who make Crash Course possible for themselves, and for everyone
else.

To find out how you can become a supporter, just go to subbable.com. This episode was written by Kathleen Yale,
edited by Blake de Pastino, and our consultant is Dr. Ranjit Bhagwat. Our director and editor
is Nicholas Jenkins, the script supervisor is Michael Aranda, who is also our sound designer,
and the graphics team is Thought Café.
