The Test of Intelligence

In reading through Gazzaniga et al.’s Psychological Science textbook, I was struck by this line about intelligence tests:

“tests generally favor those who wish to do well.”

Obviously.

But the statement is interesting when dissected. I tend to think of intelligence—or, especially, Intelligence Quotient—as a relatively static measurement of reasoning, awareness, and attention (along with a dash of test-taking wherewithal). If we take the test multiple times at the same age, we ought to score the same IQ.

Yet it’s odd to think that intelligence depends on intention. Do I intend to do well on this SAT? All right, then I’ll study for it. Yes, so I’ll read the questions carefully. Absolutely, so I’ll answer everything to the best of my ability.

But clearly this is a piece of intelligence. It’s not the only piece, since environments, disabilities, family, socioeconomic status, and nearly every other human quality affect performance (and, by extension, “intelligence”). But desire, effort, and preparation are still crucial elements.

It still sounds strange, however, to confess, “I mailed in my math quiz today because my intelligence was low.” I tend to want to differentiate between effort and intelligence. I could have done better, thus my intelligence should rate an X rather than a Y. Really, though, is wishing to do well a critical part of intelligence?

This creates a problem with the standard intelligence schema. For instance, we know Einstein was intelligent. This is a given. Unarguable. Big brain, big formulae, etc. There’s a common belief that he did poorly in school, though I believe there are holes in this theory. Regardless, I’ll bet you he screwed up at least one test/class/subject throughout his life. And on that day, he was stupid: he had low intelligence.

But if Old Albert was stupid sometimes, where does that leave the rest of us peons? This problem may be one of the reasons researchers have expanded the idea of intelligence far beyond simple IQ tests of verbal/logical reasoning into the realms of music, art, even throwing a football. Einstein had plenty of intelligences, which all added up (in some esoteric formula only he would likely understand) to his being Intelligent (with the capital I). And this bails us all out: if I’m not great at math (or, importantly, if I have no desire to learn and do well in math class), I’m saved from stupidity by my love of the harp. This is the kind of reasoning that allows us to imagine that some kids are “too smart” for school, and “too bored” to do well in structured work or educational environments. Homeschool, studying abroad, or other alternatives to standard classes are often pitched as possible ways of alleviating these problems, fostering intelligences in unconventional ways. Perhaps they do—I really don’t know.

One problem is that the word “intelligence” (like “modern,” “new,” “talent,” etc.) has gathered a massive cultural cachet. Thus everyone wants to co-opt its meaning for their own agendas. Now we even have Emotional Intelligence, alongside Color, Rock, and Stick Intelligences (I’m developing these latter ones). The clutter accumulating around “intelligence” makes it meaningless. By attempting to cash in on the weight of “intelligence,” we’ve diminished it.

I think we should toss the whole concept, at least on the personal level. Any measure or assessment of “intelligence” has external purposes (to researchers looking at big-picture trends) but almost no internal or local use, I’d argue. IQ is useful in considering group averages and correlations over time to assess society, education, and economics, certainly, but it does worse than nothing for individuals, especially children, I believe. Carol Dweck’s research shows that encouraging children based on effort rather than on ability emboldens them to try new things and take on challenges rather than staying safely within the intellectual boundaries they know.

So tell children they are good at trying, rather than that they are intelligent.

Personally, if I hadn’t been so worried about screwing up and proving myself unintelligent—carrying a mantle that many bestowed on me, with the best intentions—maybe I would have committed to learning a musical instrument, or kept playing baseball, or majored in math, or taken on a trade. By being labeled early on as Intelligent, I had one path open to me, the route to the Ph.D., the indicator of (academic) intelligence. Clearly things might have gone worse, but I still wonder sometimes what else I might have chosen had I been brave enough to venture beyond what I knew I was “good at.”

What’s done is done, but I’m applying this theory to the rest of my life, and to my (hypothetical) children’s lives. Intelligence is a useful marker, so long as we don’t water it down to uselessness. But it’s much more useful for researchers than for individuals. Encouraging effort rather than establishing essentialism lets us, and those around us, achieve the most we possibly can, regardless of intelligences. We’d be better off focusing on that.

The world may not be a meritocracy—some are certainly better off than others, due to no fault of their own—but we benefit as individuals from acting like it is so. Socially we should work for equality and to support those without the opportunities of their fellows. But every individual must believe that he or she will succeed by trying, by working, by exploring. This is an intelligence worth teaching. This is what we must “wish to do well.”

Via Dolorosa

Christianity has about the worst sales pitch: follow your Maker not in His glory or power or wisdom or wealth, but in His sufferings. Which is probably why we tend to dress the gospel up a little bit—because who wants asceticism? But until we abolish the cross and find a more amenable symbol, suffering remains at the heart of Christianity.

I’m not talking about persecution, although that discussion exists. I’m talking about the personal religious experience rather than the corporate.

W. H. Auden wrote about Pieter Bruegel’s painting Landscape with the Fall of Icarus in his poem “Musée des Beaux Arts,” musing that:

About suffering they were never wrong,
The Old Masters: how well they understood
Its human position: how it takes place
While someone else is eating or opening a window or just walking dully along;

Auden’s poem—as well as Bruegel’s original work—illustrates well both the significance and the ordinariness of human suffering. But what both artists depict is the suffering that arises from the accident of being human, the suffering of a fallen world.

Christianity’s suffering—sharing in the sufferings of Christ, as Paul put it (in a strangely seductive phrase)—is not an accident. It’s the choice to sorrow, to sacrifice something.

So can you follow Christ without suffering? It seems like it: very few people choose the ascetic life anymore, and I assume that kind of self-denial isn’t a prerequisite for Christianity. Of course we still experience suffering, but that could be attributed to being human. We could suffer perhaps by tithing or using time or energy to serve others. But that is awfully abstract.

There remains a streak of masochism in Christianity, but I’m not interested in the enjoyment of suffering. The point of suffering, to me, is that it hurts. Job never thrilled over his compounding losses. Nor is Christianity meant to be sadistic: we have no right to make others suffer in Christ’s name. Suffering largely hurts because it’s beyond our control. It’s the sacrifice of both agency and pleasure.

And despite the many good things I believe do arise from following Christ, it wouldn’t mean much if we sought God simply through programming. We are driven to seek pleasure and control, and Christ did the exact opposite. And suggests we do likewise.

I believe in God in part because He doesn’t make sense as a human invention: He’s the opposite of everything we are. Christ’s suffering opposes everything He is.

Keep the cross in Christianity. To suffer is human, but choosing to suffer is divine.

So You’re Saying There’s a Chance…

I’m thinking about optimism and odds.

Never tell us the odds, because they will always be in our favor.

I’ve conditioned myself to think that 50% are good odds. But they’re not: they’re exactly neutral. And if 1/2 is pretty good, then 1/3 isn’t bad, 1/4 is doable, 1/10 is at least plausible, and 1/100 is possible.

But it’s not, really.

This is the academic job game. I come from that world, a world where maybe 1/12 or 1/20 (probably worse) of humanities (and possibly other) PhDs get a tenure-track job, and maybe 1/2 to 3/4 of those actually get tenure.


But let’s backtrack a bit…

Odds of having graduated high school (Americans age 25 and over):

https://www.census.gov/hhes/socdemo/education/data/cps/2013/tables.html

http://en.wikipedia.org/wiki/Educational_attainment_in_the_United_States

88/100. Not bad odds at all! (9/10, would graduate again.)


Odds of earning a bachelor’s degree:

32%. Decent odds, presumably, but on the negative side (3/10, might graduate.)


Odds of getting a PhD:

2%. (1/50, probably not worth betting on.)


Odds of becoming a college professor with that PhD in hand (within two years+):

http://www.theatlantic.com/business/archive/2013/02/how-many-phds-actually-get-to-become-college-professors/273434/

20% (1/5. Meh.)


So: the odds of a generic American student becoming a tenured professor are 2% × 20% = 0.4%.

In other words, don’t bet on it. One out of 250 people will do it. And, honestly, that sounds inflated. Part of it is rounding error: the real number is close to 1 in 300. But I can’t even believe that’s the case. Like I said above, if you’re working in the humanities, your odds are worse: maybe 1/10 of PhDs in the humanities get tenure.
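For the skeptical (or the bored), the arithmetic here is just chained multiplication. A quick sketch, using the rounded figures quoted above (assumed round numbers, not exact census values):

```python
# Chained odds of the PhD-to-professor pipeline, using the rounded
# figures from the text (assumptions, not exact census values).
p_phd = 0.02    # odds a generic American earns a PhD
p_prof = 0.20   # odds a PhD lands a professorship within ~2 years

p_both = p_phd * p_prof
print(f"{p_both:.1%}")              # 0.4%
print(f"1 in {round(1 / p_both)}")  # 1 in 250
```

The “1 in 300” figure in the text presumably comes from the unrounded percentages; with the rounded inputs the product lands exactly on 1 in 250.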

At any rate, you say, at least it’s not random. True: real life isn’t random; odds give us an image, but an abstracted one. And what if we’ve beaten some of the odds already? This isn’t pure chance.

So let’s see: perhaps you graduated first in your high school class (I did). 1/400, we’ll round it. And maybe you graduated first in your class in college (thanks for asking!): 1/4000, maybe. And maybe you capped it with a PhD (humanities, thanks): perhaps ~1/10 of those college graduates managed that, according to http://en.wikipedia.org/wiki/Educational_attainment_in_the_United_States.

Odds of acing the academy (that’s what we’ll call this), of coming in first place at every stage: 1/400 × 1/4000 × 1/10, or 1 in 16,000,000.

Now I didn’t quite get first place in graduate school (3.96 GPA), but the odds are still probably similar.
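The long-shot number works the same way: multiply the three stage odds quoted above (all rough guesses in the original, not measured rates):

```python
# The "acing the academy" long shot: product of the three stage odds
# quoted in the text (rough guesses, not measured rates).
p_first_in_hs = 1 / 400        # first in a high-school class
p_first_in_college = 1 / 4000  # first in a college class
p_phd_given_ba = 1 / 10        # PhD, given a bachelor's degree

p_ace = p_first_in_hs * p_first_in_college * p_phd_given_ba
print(f"1 in {round(1 / p_ace):,}")  # 1 in 16,000,000
```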


What I’m trying to say (really poorly) is this: no matter how well you do as a student, you can’t ensure a favorable outcome, if you define a favorable outcome as a tenured professorship (fortunately, I no longer do, and I have found a different favorable outcome).

You can work as hard as you want, but you can’t guarantee it.

Full disclosure: I applied to maybe two tenure-track jobs. And I didn’t try very hard on those. It’s because I knew these were the odds, and at some point you can’t keep betting on one-in-a-million chances. You can be one of ten amazing graduates, and if there are only three jobs, it’s not likely you’ll get one. No matter how well you did before, how much you tried, how hard you worked, how truly you believed.

At some point you can’t beat the odds anymore.