You look, but you do not see, grasshopper


Yesterday I completed the first Kung-Fu mind game with the freshmen. In our first-year program instructors are allowed to choose the content.  My section focuses on deception, specifically lies, magic, magical thinking and con games.  The real focus, however, is helping 18-year-olds ramp up to university-level standards for critical inquiry and writing. 

In class yesterday we went over the opening chapter of The (Honest) Truth About Dishonesty by Dan Ariely, which covers some fascinating research on why and how we lie.  Chapter one is all about poking holes in the S.M.O.R.C. (the simple model of rational crime).  This is our default view of why people lie.  It holds that the decision to lie or cheat is simply a cost/benefit analysis. 

In other words, when we are presented with a chance to gain an unfair advantage, we weigh the risk of getting caught against the reward of getting away with it.  If the odds are decidedly in our favor, most of us will cheat or lie.  This--or a version of it--is usually what students come up with when I ask them to produce a theory for when and why people lie. 

After we put their theories on the board, we reviewed some experiments that really torpedo the SMORC.  Researchers gave a group of people puzzle matrices containing 10 solutions and asked participants to find as many solutions as they could in a set amount of time. For each right answer, the participant would receive a dollar (or $10 in some versions). The average number most people could solve was four.  

Okay, so what would happen if you gave people the chance to lie about their results and reduced the risk of getting caught to zero?  My students predicted a large increase in cheating.  But that's not what happened.  In one version of this experiment, researchers had participants first shred their puzzle sheets and then self-report their scores.  Actually the shredder wasn't really shredding and, besides, the researchers already knew the average would be four correct solutions. 

To be sure, most people did cheat (almost 90 percent), but they only did so by a little, not a lot (an average increase of two answers).  Even when primed beforehand with the false idea that the average participant got seven solutions, people would still only cheat a little (6 rather than 7, 8 or 9).  Cheating and lying were happening, of course, but something other than SMORC was clearly taking place.

So I asked my students to come up with new theories to account for these findings.  Eventually they arrived at something similar to what Ariely proposed.  People will cheat if the risk is low, but they still want to think of themselves as basically honest people, so they won't get too greedy.  In other words, our line between honesty and dishonesty is not distinct.  Most of us operate with a fudge factor, a gray borderline that separates honesty from dishonesty.   We even allow ourselves to cross into this gray zone from time to time, but we also want to retain the belief that we are basically honest people.  Indeed, when researchers reminded people before the experiment that they were on the 'honor system' to report their scores accurately (and when they had them sign a pledge to do so), lying about performance dropped considerably.

After going over all of these results, I asked my students if they thought the "fudge factor" theory was better than SMORC in describing why and when we lie.  They all agreed.  It was more accurate.  Students said things like, "He's right in a way.  I would feel bad if I said I got 10 solutions" or "Most people want to be good, but we all get tempted now and then.  Doesn't mean we're bad people."

Fine, wonderful, great work, everybody.

Next slide:  "Let's imagine that we went from offering As, Bs, Cs, etc., to offering cash payouts for top performance.  Instead of an A in a course, you receive $1,000, a B gets you $250, a C $75.  The money would be paid to you in cash at semester's end.  If we switched to this system, would cheating at this university go up or down?  Get into groups, talk it over and give me your prediction and hypothesis."

Result: they went right back to SMORC.

It did not matter that they had just seen evidence that SMORC was problematic, or that I had reminded them the risk of getting caught in the cash-for-grades scheme wasn't zero.

Did.
Not.
Matter.

SMORC it was.

I say all the time that mental models change slowly.  Students can look right at evidence, spit it back at you, explain it to you perfectly, but when you ask them to think with it they haven't moved at all. So on Monday I'll walk into class with a puzzled expression and say something's been eating at me all weekend.  I just can't figure it out.

"Last Friday you guys told me the 'fudge factor' theory was superior to SMORC in accounting for why and how we lie.  But when I gave you the hypothetical about money for grades, you reverted right back to the theory you had just discredited.  I don't get it.  What gives?"

Then, and only then, will we discuss the difference between surface and deep learning. 

Hai-Yaa!




Comments

Anti-Dada said…
Do you think a part of the fudge factor is also motivated by not wanting to go too far outside the range of what might be probable or possible for any given individual? In other words, if I fudge from the average of four to five or six then I'm merely above average rather than exceptional and it's thus less likely that I'll draw attention to myself. If I fudged to a 9 or 10 that would naturally attract more attention and by being in the spotlight it might become obvious that I'm incapable of scoring anywhere close to a 9 or 10. I'm just curious if that might be an additional aspect of the fudge factor in addition to self-conceptions of honesty (and modesty, I guess, to some degree). Whatcha think?
Professor Quest said…
It's a good question. The researchers controlled for it, though. They primed participants with the idea that seven was the average reported score, but even then people wouldn't cheat big. They still only upped their score by approximately two. In effect, they were given the green light to go big, but something held them back. Add this to the suppressive effect of honesty reminders or pledges and I think it's clear that people are vested in maintaining their orientation toward honesty; they just have a gray zone they can enter that they don't think of as too dishonest. Of course these experiments were on average people, not Wall Street brokers or no-money-down real estate agents. Results may vary.
T.J. Brayshaw said…
An alternative hypothesis (though maybe not likely) is that the students think there's a fundamental difference between the two experiments. If they go that way ("I don't know. It's just, um, different."), maybe you could ask them to try to very explicitly articulate WHY they think it's different. I find sometimes my students finally "see" that they aren't really thinking critically once they're asked to very clearly defend a statement, thought, etc. that they previously "felt" very strongly. (It might also be an interesting way to lead to a discussion about self-deception!)
Anti-Dada said…
This response has nothing to do with cheating. Instead, its focus is on self-conception. I was partaking in a philosophy group and the moderator who designed the group gave brief 15-minute presentations before opening up the next couple hours to discussion. After about three months he let the group know that he thought it would be a good idea for members of the group to give presentations, alternating each week to allow another member to make a presentation. The rules were that the presentation had to be related to the subject matter; we started with predicate logic but had shifted to Kant's critique of pure reason. During one of the discussions the issue of what rationality was and how it differed from reason came up. I and a few others were of the opinion that rationality is an incoherent concept, meaning too many contradictory things to be useful. For example, some use it interchangeably with reason while others claim there is a distinction, but in the latter cases the differences always ... differed.

Anyway, by then we had shifted to alternating presentations each week. One woman who had an undergraduate degree in philosophy and a master's in educational theory decided to tackle rationality: She had been one who had claimed that rationality existed as a coherent concept. She was vague in her answers about what rationality was so I (and others) were intrigued by what she might present. She went through her 15 minutes using a white board and a marker to create concept maps and bulleted lists, but she had trouble wrapping it all together into a coherent whole (which is what I and some others had basically been arguing). When her presentation was finished the group all but pounced on her. Not maliciously, of course, but with intense critical scrutiny. It was a well-educated group in their 30s and 40s and they took thinking seriously. None of the questions or arguments being made against the points of the presentation were leveled at this woman personally. It was her ideas that we were dicing up, tasting, and spitting out.

However, as the debate raged she became more and more defensive and clearly embarrassed as her ideas, frankly, were sophomoric. She took this personally and her flushed red face showed it. We took a break and she stepped outside. The moderator went out to talk to her. When he came back inside he told us, quietly, that she was very upset and hurt. A few members said, "Well, it's a PHILOSOPHY group! We're not here to play patty-cake. If you're going to make claims then you have to justify them in some fashion." I was a little more on the fence. I agreed with them on the one hand and yet on the other I understood how difficult it probably was for her given that she was out of her league with this group. I think she assumed because she had degrees in philosophy and theory that she was well-equipped to discuss philosophy. She clearly wasn't, at least not to the degree that others in the group were. I didn't want to chase her away, though, and stomp on her interest in exploring thought. On the other hand, she showed no signs of knowing how to participate in a group that challenged one another's thinking.

Anti-Dada said…
That was the last time she showed up. The moderator told us that she sent him a scathing email, blasting everyone in the group for being mean-spirited, saying something to the effect of "If this is what philosophy is then I want no part in it." Her ego had been damaged and, from my perspective, her self-conception of being knowledgeable about philosophy had been assaulted. In order to maintain her self-conception, she bailed (my opinion). I have said it before and I will say it again: discovery is undefeated when matched against belief. If faced with evidence that your beliefs are wrong, it's necessary to let go of those beliefs and reconstruct. I admit it's a devastating and humbling process, but it's far worse to continue believing that you have wings when everyone can see that you don't. If she had continued and humbly listened and learned, asking questions instead of making claims, she could have made progress. But self-conceptions, self-identity, they can be terribly fragile. Challenges to who one is, even if they come indirectly, can really do harm. Then again, that's why self-exploration is so important. An unexamined life is not worth living, right?

I guess my point is how much aversion to humiliation plays a role in avoiding putting oneself out there. A person really has to have a strong sense of self to be able to admit he or she is wrong about something significant, something related to one's identity. I was just thinking about that in relation to students who had the self-conception of being honest when they obviously weren't. A spectrum of honesty, I suppose. "I'm only this dishonest and that's within the socially acceptable range of dishonesty." In that case, students are clearly not establishing their own identity but rather looking for external clues as to who they should be. You know me well enough to know how distasteful I find that. It may be the way it is and always has been, but it doesn't make it taste any better.
Professor Quest said…
Hi T.J.,

I pointed out this morning that it was odd to me that they began with SMORC, saw evidence that it wasn't a good model, and then returned to it when confronted with a new situation. They saw the disconnect. Then we talked about the concept of deep learning (changed thinking when confronted with new data) and how mental models don't always change quickly, even for highly trained people. The important thing, I told them, was to try as best they could to build their ideas from the evidence, something I have been stressing in their daily responses. Freshmen are locked into the plug-and-chug paradigm. Here is the answer. Done. I'm trying to get them to construct ideas rather than regurgitate them back at me. So, after this discussion, I had them assess the effectiveness of the college's approach to academic honesty (students sign an integrity pledge at the opening convocation and it's never mentioned again). They concluded that this was a lousy approach. Then they proposed revisions based on the research we have looked at over the past two weeks. They had to show me how their proposals emerged from research presented in the texts. It went pretty well. They actually had some good suggestions. And I'm required to cover academic honesty in the first-year seminar, so I got that box checked as well.
Professor Quest said…
Loq, like you I have mixed feelings about such incidents. On the one hand, putting your ideas out there for consideration invites scrutiny and you ought to expect that not all of it will be tender, considerate or affirming. On the other hand, what moral and social responsibilities fall on the critics to be present to the dynamic of what's actually happening? It's easy to fall back on the "this ain't beanbag" defense of one's actions. Like you said, no one attacked her personally, but the blushing and emotional reaction of the woman might have suggested--as it seems to have for you--that she wasn't seeing it that way. I'm not a particularly religious man--as you well know--but I do have a great admiration for those scriptures like the "Woman at the Well" or "Mary and Martha." You know, the ones where Jesus kind of slaps the disciples upside the head and says, "You guys are playing by the rules, but you aren't really paying attention to what's really happening here."

Sure, she should have expected her ideas to be criticized if they weren't up to snuff, but there's law (I can't believe I am making this analogy) and then there's gospel and grace. I sat through far too many brutal critique sessions while in a writing program in grad school. I saw people smashed up. I got smashed up a few times. I always admired those workshop leaders with the perspicacity to attend to the person before them. It was a gift.
Anti-Dada said…
I agree with you completely that there were plenty of clues that lightening up on the criticisms would have been best. I followed the lead of the moderator who did just that and tried, in vain, to turn down the heat. However, many of those present, intelligent and knowledgeable as they were, didn't exactly have the emotional radar necessary to detect discomfort in others. They were primarily analytical thinkers without rich emotional resonance. Their capacity for empathy was limited. They were emotionally dense. The woman, on the other hand, had a very rich and broad emotional range but lacked the finesse and subtlety of thought necessary to function well in that particular group. Deficits on both sides. Perhaps a prime example of why there will always be division and strife in the world. ... I love the references to scripture, by the way. I agree with you, I wouldn't throw out the baby with the bathwater when it comes to the Bible. Still, funny to see you using gospels to make your point. Makes me giggle just a bit.
Alexis G. said…
Hi, I'd like to chat with you about first year seminars as mine is similar to yours. Please look me up (or reply here) to let me know how to get in touch with you. Many thanks.
