kirisutogomen ([personal profile] kirisutogomen) wrote2008-05-16 01:33 pm

Iterated polarization games

I didn't do a very good job with this post yesterday. I've been trying to work on a problem that really requires a program written in a real grown-up language, but I'm doing it in a spreadsheet instead. That's several orders of magnitude less efficient, it's really crushing my computer, and I end up doing violently abbreviated versions of everything else. I really ought to learn how to code.

Anyway, the original article is here. (Cass R Sunstein (2002) "The Law of Group Polarization" Journal of Political Philosophy 10(2), 175–195)

Groups consisting of individuals with extremist tendencies are more likely to shift, and likely to shift more (a point that bears on the wellsprings of violence and terrorism); the same is true for groups with some kind of salient shared identity (like Republicans, Democrats, and lawyers, but unlike jurors and experimental subjects). When like-minded people are participating in "iterated polarization games" — when they meet regularly, without sustained exposure to competing views — extreme movements are all the more likely.


The problem with the parentheses was that the PDF I copied it from doesn't allow copy and paste, so I had to type it in myself. Sorry.

(Also see Hotelling Beach)

[personal profile] dpolicar 2008-05-16 08:38 pm (UTC)(link)
Ah. I see. I think.

But we're talking about something importantly narrower than "changing their beliefs." We're talking about changing their beliefs along a specific kind of vector, which we're labeling "extremism." We absolutely don't expect them to change their beliefs to something less "extreme".

Though... hm. This gets tricky to talk about.

I think it's fair to define an "extremist belief" as a belief that significantly differs from the relevant community norms. As a community becomes more militaristic, pacifism becomes more extreme; as it becomes more secular, theism becomes more extreme, etc. A belief in democracy as the best form of government is extremist in some places and not in others.

How this differs from an "unpopular belief" I'm not entirely sure, but never mind for now.

So, we're saying groups all of whose members hold far-from-norm beliefs along a common vector tend to exchange them for further-from-norm beliefs along that vector, to a degree proportional to how far they were from the norm to begin with. Yes?

So... hm. When I put it that way, it's surprising -- I tend to expect regression towards the mean. Not in any mystical or teleological way, just as a consequence of unconstrained random walks.

Though, of course, the walk isn't unconstrained. A subgroup that self-selects based on membership in one pole of an oppositional pair won't have the same norm as the group as a whole... duh.

So, OK. Right there with you.

Perhaps incidentally, I'm not really sure what an "extremist tendency" is (as distinguished from actually holding an "extremist belief"), other than the thing that we posit one has just before becoming part of an extremist group.

[identity profile] kirisutogomen.livejournal.com 2008-05-16 10:13 pm (UTC)(link)
OK, now I see what the problem was. This is not a terribly good job of extracting a good summarizing quote from a 21-page article. I think it was selected more because of that cool phrase, "iterated polarization games," than for its value in conveying the heart of the argument.

I think that if I describe the two main contending explanations for group polarization it would make things clearer, especially with respect to what an "extremist tendency" is supposed to be.

The first explanation is based on the fact that people usually like to be near the center of whatever group they're in. If my "natural" political position scores 40 liberalism points, and the center of my society is at 50, I am likely to be more comfortable at 41 or 42. If, however, I restrict my environment to a group whose center is at 30, I'm likely to go to 39 or 38, which is more extreme relative to the society at large (still hanging out at 50). So it doesn't necessarily require that every member of my restricted circle be below 50, just that the group's center is.

The iterated part comes in because, by moving from 42 to 38, I have helped move the center of my group down to 29.8, and the next round of shifts will be biased even lower.
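This is the sort of thing that takes a few lines of code rather than a spreadsheet. Here's a minimal sketch; the group size of 20 and the 20%-of-the-gap conformity rate aren't from the article, they're assumptions I've picked so the numbers reproduce the example above (42 → 38, center 30 → 29.8), and for simplicity the other 19 members are held fixed:

```python
NATURAL = 40   # my "natural" position (from the example above)
RATE = 0.2     # fraction of my natural-to-center gap I close (assumed)
N = 20         # group size (assumed; makes my 42 -> 38 move shift the center to 29.8)

others_sum = 30 * N - 42   # the other 19 members, held fixed for simplicity
me = 42                    # expressed position carried over from wider society

for round_no in range(1, 4):
    center = (others_sum + me) / N          # recompute the group center
    me = NATURAL + RATE * (center - NATURAL)  # conform partway toward it
    print(f"round {round_no}: center was {center:.2f}, I move to {me:.2f}")
```

Running it shows the center drop from 30.00 to 29.80 after the first round and then settle, since with everyone else frozen the feedback converges quickly; with all 20 members conforming each round, the drift compounds instead.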

The second explanation, which I personally find somewhat more intuitively appealing, is based on the fact that there are usually several different arguments in favor of any given position, and that any single person will have thought of only a few of them. If I believe that SUVs are evil for some safety reason, then in regular society I will be exposed to various opinions both for and against SUVs. But in a restricted environment composed mostly of SUV-haters, I will encounter a bunch of unfamiliar arguments bolstering my existing position: I talk to someone who hates SUVs because they increase our reliance on foreign oil supplies, then someone who hates them for contributing more greenhouse gases, then someone who hates them because they take up more parking space. I'm likely to incorporate these confirming arguments into my own beliefs, thus making my position on SUVs more extreme. Note that in this case we could start with a group of people every single one of whom starts at 40 SUV points, but as we absorb each other's additional arguments, we all move down together.
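This one is even easier to sketch. Again the specific numbers are my own assumptions, not the article's: everyone starts at 40 SUV points with one argument, pooling happens all at once, and each newly absorbed argument is worth a flat 3 points (lower = more anti-SUV):

```python
STEP = 3  # points of movement per newly absorbed argument (assumed)

# Each member starts at 40 with exactly one argument against SUVs.
members = {
    "safety":      {"safety"},
    "foreign_oil": {"foreign_oil"},
    "greenhouse":  {"greenhouse"},
    "parking":     {"parking"},
}

# All arguments aired in group discussion get pooled.
pooled = set().union(*members.values())

for name, known in members.items():
    new_args = pooled - known                 # arguments this member hadn't thought of
    position = 40 - STEP * len(new_args)      # lower = more anti-SUV
    print(f"{name}: absorbs {len(new_args)} new arguments, moves to {position}")
```

Every member absorbs three unfamiliar arguments and moves from 40 to 31 in lockstep, which is the point: nobody here started below anybody else, and the group still polarized.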

So what's an "extremist tendency"? In the first explanation, it's just being somewhat away from the norm, while in the second explanation, it's having reached a tentative belief based on just one or two reasons.

[personal profile] dpolicar 2008-05-17 03:06 am (UTC)(link)
Yeah, that's more or less where I ended up, but the context helps. For what it's worth, I'm inclined towards the second explanation, for very loose definitions of "argument in favor of". (So loose, in fact, that I'd prefer to say "statements that reinforce.")