The Test of Intelligence

In reading through Gazzaniga et al.’s Psychological Science textbook, I was struck by this line about intelligence tests:

“tests generally favor those who wish to do well.”

Obviously.

But the statement is interesting when dissected. I tend to think of intelligence—or, especially, Intelligence Quotient—as a relatively static measurement of reasoning, awareness, and attention (along with a dash of test-taking wherewithal). If we take the test multiple times at the same age, we ought to score the same IQ.

Yet it’s odd to think that intelligence depends on intention. Do I intend to do well on this SAT? All right, then I’ll study for it. Yes, so I’ll read the questions carefully. Absolutely, so I’ll answer everything to the best of my ability.

But clearly this is a piece of intelligence. It’s not the only piece, since environment, disability, family, socioeconomic status, and nearly every other human quality affect performance (and, by extension, “intelligence”). But desire, effort, and preparation are still crucial elements.

It still sounds strange, however, to confess, “I mailed in my math quiz today because my intelligence was low.” I tend to want to differentiate between effort and intelligence. I could have done better, thus my intelligence should rate an X rather than a Y. Really, though, is wishing to do well a critical part of intelligence?

This creates a problem with the standard intelligence schema. For instance, we know Einstein was intelligent. This is a given. Unarguable. Big brain, big formulae, etc. There’s a common belief that he did poorly in school, though I believe there are holes in this theory. Regardless, I’ll bet you he screwed up at least one test/class/subject throughout his life. And on that day, he was stupid: he had low intelligence.

But if Old Albert was stupid sometimes, where does that leave the rest of us peons? This problem may be one of the reasons researchers have expanded the idea of intelligence far beyond simple IQ tests of verbal/logical reasoning into the realms of music, art, even throwing a football. Einstein had plenty of intelligences, which all added up (in some esoteric formula only he would likely understand) to his being Intelligent (with the capital I). And this bails us all out: if I’m not great at math (or, importantly, if I have no desire to learn and do well in math class), I’m saved from stupidity by my love of the harp. This is the kind of reasoning that allows us to imagine that some kids are “too smart” for school, and “too bored” to do well in structured work or educational environments. Homeschool, studying abroad, or other alternatives to standard classes are often pitched as possible ways of alleviating these problems, fostering intelligences in unconventional ways. Perhaps they do—I really don’t know.

One problem is that the word “intelligence” (like “modern,” “new,” “talent,” etc.) has gathered a massive cultural cachet. Thus everyone wants to co-opt its meaning for their own agendas. Now we even have Emotional Intelligence, alongside Color, Rock, and Stick Intelligences (I’m developing these latter ones). The clutter accumulating around “intelligence” makes it meaningless. By attempting to cash in on the weight of “intelligence” we’ve diminished it.

I think we should toss the whole concept, at least on the personal level. Any measure or assessment of ‘intelligence’ has external purposes (for researchers looking at big-picture trends) but almost no useful internal or local uses, I’d argue. IQ is useful in considering group averages and correlations over time to assess society, education, and economics, certainly, but it does worse than nothing for individuals, especially children, I believe. Carol Dweck’s research shows that encouraging children based on effort rather than on ability emboldens them to try new things and take on challenges rather than staying safely within the intellectual boundaries they know.

So tell children they are good at trying, rather than that they are intelligent.

Personally, if I hadn’t been so worried about screwing up and proving myself unintelligent—carrying a mantle that many bestowed on me, with the best intentions—maybe I would have committed to learning a musical instrument, or kept playing baseball, or majored in math, or taken on a trade. By being labeled early on as Intelligent, I had one path open to me, the route to the Ph.D., the indicator of (academic) intelligence. Clearly things might have gone worse, but I still wonder sometimes what else I might have chosen had I been brave enough to venture beyond what I knew I was “good at.”

What’s done is done, but I’m applying this theory to the rest of my life, and to my (hypothetical) children’s lives. Intelligence is a useful marker, so long as we don’t water it down to uselessness. But it’s much more useful for researchers than individuals. Encouraging effort rather than essentializing ability allows us, and others, to achieve the most we possibly can, regardless of intelligences. We’d be better off focusing on that.

The world may not be a meritocracy—some are certainly better off than others, through no fault of their own—but we benefit as individuals from acting like it is so. Socially we should work for equality and to support those without the opportunities of their fellows. But every individual must believe that he or she will succeed by trying, by working, by exploring. This is an intelligence worth teaching. This is what we must “wish to do well.”

Via Dolorosa

Christianity has about the worst sales pitch: follow your Maker not in His glory or power or wisdom or wealth, but in His sufferings. Which is probably why we tend to dress the gospel up a little bit—because who wants asceticism? But until we abolish the cross and find a more amenable symbol, suffering remains at the heart of Christianity.

I’m not talking about persecution, although that discussion exists. I’m talking about the personal religious experience rather than the corporate.

W. H. Auden wrote about Pieter Bruegel’s painting Landscape with the Fall of Icarus in his poem “Musée des Beaux Arts,” musing on the painting that:

About suffering they were never wrong,
The Old Masters: how well they understood
Its human position; how it takes place
While someone else is eating or opening a window or just walking dully along;

Auden’s poem—as well as Bruegel’s original work—illustrates well both the significance and the ordinariness of human suffering. But what both artists depict is the suffering that arises from the accident of being human, the suffering of a fallen world.

Christianity’s suffering—sharing in the sufferings of Christ, as Paul put it (in a strangely seductive phrase)—is not an accident. It’s the choice to sorrow, to sacrifice something.

So can you follow Christ without suffering? It seems like it: very few people choose the ascetic life anymore, and I assume that kind of self-denial isn’t a prerequisite for Christianity. Of course we still experience suffering, but that could be attributed to being human. We could suffer perhaps by tithing or using time or energy to serve others. But that is awfully abstract.

There remains a streak of masochism in Christianity, but I’m not interested in the enjoyment of suffering. The point of suffering, to me, is that it hurts. Job never thrilled over his compounding losses. Nor is Christianity meant to be sadistic: we have no right to make others suffer in Christ’s name. Suffering largely hurts because it’s beyond our control. It’s the sacrifice of both agency and pleasure.

And despite the many good things I believe do arise from following Christ, it wouldn’t mean much if we sought God simply through programming. We are driven to seek pleasure and control, and Christ did the exact opposite, and suggested we do likewise.

I believe in God in part because He doesn’t make sense as a human invention: He’s the opposite of everything we are. Christ’s suffering opposes everything He is.

Keep the cross in Christianity. To suffer is human, but choosing to suffer is divine.

Daylight Saving Time

DST demands a question: would we rather see our fellow bus passengers better on the way to work, or on the way home?

Mornings are depressing enough that you probably don’t want to see your fellow passengers anyway.

But evenings you kind of just want the peace of mind afforded by a darkened metal tube stuffed with smelly strangers.

How do you decide? Well, luckily, as with every other crucial decision, you don’t have to decide: that’s what the government’s for! It’s also for public transportation.

One way to tell time is by cat sleep cycles. It’s a way, but it’s not a particularly good way.

We got a new kitty: Calla Lily. She looks like the flower so I insisted we name her accordingly.

It is kind of hard to say, to be honest. So she’s going to get variations on that name mixed with “Kitty,” which, it turns out, is her surname. She answers to it, at least. We got her to be a living statue in our house. A sphinx. She obliges and is certainly full of riddles. Where did you come from, where did you get that painted face? She looks past us and smiles. I’ll have to wait for the details.

She makes things interesting already. Turns out there is a ton of stuff to do under the futon. And also if you put your face in the door crack you can get high. Or something. Cats.

She doesn’t seem like a destroyer of worlds. Which makes me worry a little bit. Because when she does get pissed, something huge is going to go down.

Aren’t all cats enigmas? I think so. That’s why they’re fun to keep around. Plant them in your house and let them grow into funny little mysteries. They’ll drop a few clues now and then, usually. But Calla is cagey and seems less inclined to let me in on any of it.

That’s all right, we’re from different universes. But I’m happy she can crash in this one with us for a while.

Maybe she can help me solve DST.

Acting at Life

A while back Netflix was offering me few alternatives, so I settled on Johnny Knoxville’s performance in that stirring paean to geriatric irresponsibility, Bad Grandpa. Knoxville acting as gross old man Irving Zisman isn’t particularly memorable, but it is a bit of a departure from his Jackass work in that he plays one character throughout, rather than mixing it up or simply being himself. The Jackass TV series and films still set the pattern for this movie (as did Sacha Baron Cohen in Borat, etc.): a performer (or performers, in the case of Bad Grandpa) remaining in character while interacting with people who are unaware they are being filmed until afterwards. Thus actors are acting with actors who don’t realize they are acting (or aren’t acting at all, depending on your definitions).

The result is a film made up largely of actors who don’t realize they’re making a movie. So a sort of “reality” film. Except the situations aren’t real but contrived—it’s just that 90% of the folks we see in the movie believe them to be real (with funny/offensive results, as can be inferred).

It seems odd to make art (or “art”) out of individuals who have no idea they’re making it. Of course, if the actors-playing-themselves (unawares) had refused to have their images shown in the film, we probably wouldn’t have seen their actions. So in a sense they had some agency in their acting—but not enough to actually be able to change their reactions on film, only to allow them to be shown or excised.

What’s even stranger to me, though, is when everyone involved in a film—especially the director, producers, and principal actors—seems not to know what movie they are making. This has happened on many occasions, I propose: Troll 2 and The Room come to mind. These are movies that we have to believe were made as serious, “artistic” movies, rather than created as some kind of tongue-in-cheek Sharknado spinoff. These are movies that fail as serious films but succeed as hilariously inept spectacles. Thus, the directors of these films—like the unwitting actors in Knoxville’s or Cohen’s movies—actually had no idea they were making the movie that resulted from their efforts. They believed they were creating Movie A—a horror film, or a serious drama. But the audience who saw these movies saw immediately that there was no Movie A, only Movie B—a hilariously schlocky nonsense-fest.

This is all perhaps a cautionary tale: what stories do we create that we won’t be able to see? Is all of our seriousness just comic? Probably. To see humans as anything but comical and revolting would probably take a very sympathetic Audience.