Monday, December 24, 2012

Books for My Holiday Season Ritual


I have a holiday ritual I engage in every year after I submit final grades for my Fall semester classes, usually around December 17, give or take a few days.

I know that mid-December is really late to start thinking about the holidays. But the truth is that every year it gets a little easier to ignore the frenzy, and with the submission of final grades (not the Thanksgiving Day Macy's Parade), I pass through my own private gateway to the Christmas season. Then I begin my holiday ritual: rereading a favorite book. Yes, rereading a book during the holiday season is my ritual. It doesn’t take away from family time, at least from what I've heard. It does take away from Internet use. But that’s okay. And the books I’ve reread have given a definite feeling to my holiday seasons. In the same way that a friend of mine looks forward to National Novel Writing Month each November with joy, I look forward to this time of personal retreat, reflection, and seeing things in an old book that I didn’t see before.

My Criteria
My holiday book choice follows simple criteria, at least the ones I'm aware of. The book has to be one I’ve read and enjoyed in the past, or at least thought at the time of first reading that it might deserve rereading someday, should I live long enough. I may have taught it in a class or been taught it in a class, or it may be a book a friend recommended.

As a tangent, I should add that many of my favorite books have come from a friend’s suggestion. This is part of my enjoyment when I reread during the holidays: I will also think about the friend who recommended the book, about their influence on my life, and about where they might be right now. I will think about other friends from the season in my life when I first read it.

It is usually also a book I've owned for a while, one that is somewhat used, dog-eared even, with my own annotations from previous readings. It may have to do with Christmas or the Christian faith, but it doesn’t have to.

These criteria can be considered quirky, even sentimental. But here are a few of the books I've reread over the years: various Sherlock Holmes stories; most of the Narnia tales, or Lewis’s Out of the Silent Planet; Emily Dickinson's poetry; Chekhov's stories; Tolstoy; The Wonderful Flight to the Mushroom Planet (a book I read in fifth grade); Dostoevsky’s Notes from Underground.

This isn't a complete list, and it isn't a scientific sample. In looking at it, I realize that it involves mostly reading for pleasure. It seems to suggest I favor male authors. And it reminds me of one last criterion I can think of for it: The book should hold out a world I’d like to enter again. 

I find the English fantasy writers and the nineteenth century Russian novelists especially good at this last standard and highly conducive to holiday reading.

This year, I’m rereading a Charles Williams novel called War in Heaven. It’s not brilliantly written. But because it is based on the premise that the Graal of Arthurian legend has been discovered residing in a small country church outside London, the "what would happen if" possibilities are inviting.

Home for the Holidays 
Reading is my own private ritual. Over the years, I’ve come to feel about the authors and books I reread during the holidays the way that some feel about Scrooge, Charles Dickens, James Stewart, and the Peanuts gang.

I would love to hear of other books you might enjoy, but also of other holiday rituals you might have.  

Happy holidays. May you have rest, recreation, and good reading.  


Friday, December 14, 2012

“Fiscal Cliff” or “Slope”? What’s in a Metaphor Anyway?

When I was a graduate student, one book I frequently heard mentioned by my professors and peers alike in a variety of contexts was Metaphors We Live By. Written by George Lakoff and Mark Johnson and published in 1980, this work of practical philosophy examines the way that metaphors shape collective thought and action. In particular, Lakoff and Johnson have much to say about how war serves as a metaphor in debate and gender politics, as in “the battle of the argument” and “the war between the sexes.”

As a lover of poetry, I’ve long understood that metaphor and simile are not ornaments, frivolous scrolls of language to make our manuscripts pretty. Rather, a metaphor is thought itself. So Lakoff and Johnson’s discussion of argument as war, which includes the idea of “blowing an opponent’s argument out of the water,” might explain why many people view argument as fighting and generally hate election seasons.

I also found it clarifying that a culture that perpetually refers to relationships between men and women as “war,” a peculiarly masculine view, I might add, is not going to view friendship between men and women as possible. Nor will it be optimistic about marriage as anything other than détente.

Two Ways of Metaphor
I teach, write, and do research in two subdivisions of English: Creative Writing and Composition Studies. These two fields encourage us to think about metaphor in different ways. In Creative Writing, metaphor is classified as either clichéd or new. Clichéd metaphors might include “love is a rose” and the notion that we are “blank slates” at birth, which I believed until my wife and I had our first child. But unless it is expressed in a character’s speech and therefore seen as revealing something about that character, a clichéd metaphor should be pulled from fiction or poetry like so many dandelions. Because it is so commonplace that we no longer think when we use it, clichéd metaphor is dead weight. In contrast, new metaphors (I hope my references above to dandelions and dead weight will count) are valued for their power to make us see experience in new ways.

But as a Composition teacher, I also view metaphor as a rhetorician would—the way Lakoff and Johnson see it in their book—as a door into the way a culture frames issues. Commonplaces and clichés, for all their dullness in creative writing, are in rhetoric doorways that politicians always enter, or at least open and consider. At any rate, they ignore them at their peril.


Metaphors and Issues: “Fiscal Cliffs”
If Lakoff and Johnson were writing their book today, they might find real interest in a metaphor now getting a lot of media circulation. I refer, of course, to the “fiscal cliff” that is, like the Mayan calendar’s presumed prediction of the end of the world, fast approaching.
The political consultant who came up with this one is a rhetorical genius.

Like most voters, I’ve become concerned about what is going to happen at the end of the year if Congress and the President don’t work out some agreement about those pesky Bush-era tax cuts set to expire. Do we raise taxes only on the wealthy, raise them on everyone, or keep the tax cuts in place for everyone?

Most of the media I hear and read will present the issue in these terms. 

The question not generally being raised in all of this, however, is whether the metaphor is true. We don't ask because we are already looking past the metaphor. And it is a metaphor. But perhaps it is time to ask. Do the cuts, or the lack of them, constitute a fiscal cliff? Do they deserve their own metaphor? And as Lakoff and Johnson might ask, how does the metaphor shape the way most of us are thinking about this vote?
Here's how we are thinking. We are entertaining two probable scenarios, neither of which has been acknowledged as only probable. The one exception I am aware of is the columnist Jack Shakely, admittedly left-leaning, who has suggested that the “cliff” in question might really be only a slope. According to Shakely's way of seeing, letting the tax cuts expire—all of them—might cause some problems for a few months. But eventually, the national debt would lessen and basic services would continue to receive funding. It may really be only a slope. Or a hill.
 
 
The Need for Certainty
Are we approaching a “cliff”? What would result from allowing the tax cuts to continue?
 
 
I haven't the space or the background to cast light on the economics behind this. But I can note that an organizing metaphor may have replaced our ability to have a discussion about it. A general unconsciousness about how we use language may be implicated in the creation of a false dilemma. Instead of real public debate (for some that would be unpleasant warfare), we are being subjected to a cliché—the "cliff"—and the submerged image of a game of chicken (Who will be the first to blink in this political battle of wills?).
 
 
Shakely’s scenario is, of course, like the others, also only probable. It may be a cliff, a hill, or a slope. We won't know until it arrives. 

 
And that presents the problem. We are not generally willing to mess around with what is only possible or probable. We demand certainty.
 
 
So we hold to our metaphors as our maps.
 
 
I give Shakely credit for at least reminding us that instead of a real cliff, it may really be another metaphor we are fast approaching.  



Works Cited


Lakoff, George, and Mark Johnson. Metaphors We Live By. Chicago: U of Chicago P, 1980.

Shakely, Jack. "Let's Take the Plunge." Los Angeles Times. 5 Dec. 2012: A17.


Tuesday, December 4, 2012

More Writing Advice: What are the "Basics" of Writing?

I teach a course in writing for teachers. In this course, I often hear about “the basics of writing.” My students interested in primary grade teaching are especially convinced that there are fundamentals students need to know before they go on to the complexities.

I am really not playing dumb when I ask, “What are the basics of writing?” I do want to know.

For students coming out of the American general education movement, the basics are always these: grammar, sentence structure, and paragraphing. These three areas are also what universities and colleges focus on in their “basic” or “remedial” writing courses. The assumption is that their students didn’t get enough of these when they were younger. Sometimes penmanship is thrown in as a fourth basic, though this seems to be declining in importance with the rise of laptops. But regardless of whether we count three or four of them, the remedy to the problem is always seen as going back to them, as though writing is a building and it needs a good foundation.

I like the idea of foundations, but I do think the metaphor only goes so far before it stops being useful or clarifying. So I prompt my students. Is imagination a basic of writing? I ask. How about audience? What about story? What is a basic, and what is not?

It should be noted that grammar, punctuation, sentence structure, and paragraphing are not aspects of composition. They are aspects of language arts in the lower grades and linguistics and traditional, prescriptive grammar in the higher. They should be taught, but when we teach them, we are still not teaching writing. Teaching writing is something more akin to problem solving. When we write, we begin to reflect and discover, make connections, learn what we are really thinking. We weave. One of the ancient meanings of composition concerns the idea of weaving, of bringing things together into a whole. Compose.

Still, my questions leave most of the people I voice them to cold. Surely I’m missing what is obvious. But I don’t think so. Imagination, a quality and habit of mind and heart children lose the longer they are in school, may be what we need first. So I press for it anyway, in spite of the culture around me that cries for “basics.”

I propose that it takes imagination to project how another person, quite different from us, will respond to our ideas. Children are good at this. We can play games with them about this, and they will go along with us.

To teach writing, then, is to engage the imagination. It is to engage the ear, as with music. It is to engage the mind and the heart. It is to explain, certainly, but more often it is to proclaim and believe and doubt. It is to reflect, sometimes deeply, on why we are here, and what we should do now that we are here.

These, I argue, are the basics of writing.

My students will not agree. My culture does not agree. In a culture where science and exposition are the priority, I stand guilty of fantasy. Writing, at bottom, is like brushstrokes or like learning musical scales. Writing is, like science, the uninteresting, mechanical side of dullness. Certainly every watchmaker, they argue, every scientist, every painter, every musician, learns the equivalent of this—the scale, the periodic table, the places of nuts and bolts.

But, I counter, the scientist dreams and fantasizes. The watchmaker dreams watches into existence.

Writers, I want to argue, do the same. As teachers, we could do worse than to coach them into their dreams.


Friday, November 23, 2012

General Writing Advice, Part 4

This fourth blog in a series (non-consecutive, I should add) will seem pedestrian to some readers, and I have to agree that it is. But the other day it happened to me again, and I decided that this needs to be said. By way of preface, let me add that because I both teach and try to produce writing, I am perhaps too aware that writing is not fully respected by most as a discipline, not like physics or math, and that writing is taught in the schools as though most people will not be expected to learn or use it.

Writing in the Red
The other day, a colleague from another department at our university heard me say that I was revising my textbook.

“You’re proofreading it?” she asked.

“No,” I said, “I’m revising it. I’m developing new content, adding a new chapter, and rearranging some of it.”

I was about to explain the purpose behind my revision, but she appeared suddenly puzzled, so I let it drop. But my friend’s inner paradigm for writing was obvious: Write one draft, proofread it, and then hand it in. Further “writing” (read “revisions”) means you’re not a very good writer.

Old School
This is the pattern we are still taught in the schools. Write a first draft (this is called “sloppy copy” in elementary school), then proofread it, and then hand it in. Second efforts only involve cleaning up what is sloppy.

This has been the problem I’ve tried to address in my textbook for first-year college writing, which, as I’ve already noted, I’m revising now. General Study writing is what most people remember from their public school experiences. It emphasizes a curriculum of grammatical correctness, even though the California standards include language about teaching writing as a process.

Red Pens in the Closet
The goal I’ve long had for my textbook is to make it a bridge from this reductive view of writing over to the way real writers work, which is messy and personal. A few of the chapters work to do this, but my goal is for the whole book to become a solid introduction to writing as a real subject. This is why I’m revising it.

Writing is difficult to teach and made even more difficult in high school where the curriculum is rushed to meet the standards, most of the time is spent on reading literature, and the classes are too big.

My students, persuaded by twelve years of General Study writing, get their rough drafts back, correct the few errors I’ve noted, ignore my marginal comments calling for more developed thought, clearer organization, and better transitions, and then don’t understand why their grade doesn’t go up. They are not interested in further exploration or discovery.

But that’s just the problem. How do you teach thought development? How do you teach seeing and reasoning, or the imaginative qualities that go into thinking about an audience beyond the teacher with the red pen? How do you teach these skills when all of your students and their parents don’t believe that they have anything to do with your discipline?

Old Habits, New Curriculum
The old ways of thinking die hard. Even when I help writing teachers to symbolically stop using the red pen and start emphasizing skills like invention and revision, the change is slow.

I can only speak for myself here. I remember the first time I was told that marking up papers in red wasn’t an effective way of grading, that I needed my students to attend to larger, more complex issues in their writing. So I got rid of my red pens, used green for a while, and then discovered that I was using green to mark errors.

Old habits die hard.

 

 


Sunday, November 11, 2012

A Bit about Bureaucratic Tongues

Crafting phrases and sentences in bureaucratese is not one of my strengths. Oh, I can do it. I can write, say, “We seek a culture that advocates for excellence.” But I don’t have to feel good about it. To me, “Burt wiped the bread crumbs from his beard” is preferable as a sentence that actually says something in contrast to the ghostly, collective barb above. 

I would like to think that “advocating for excellence” says something, but think about what is actually being done and said. How do I advocate? Join another committee? Write an ad? Make my family uncomfortable? Put a bumper sticker on my car? Is that enough advocacy? Excellence is rare, of course, but people in educational circles speak and act as though it happens all the time. A few years back, a slogan on my daughter’s high school billboard read like this: “Excellence is expected.” I could see the fires of a concentration camp in that one. Excellence, that most rare of human achievements, is expected, and you will produce it? Or else? 

Indeed, on one academic committee on which I serve, I’ve heard “excellence” batted around so much that it has taken on the hue of a word like “awesome,” a word that once meant something like fear-inspiring but now means something closer to “being exciting in a dorm room sort of way.”

Even more surprising of late for me is hearing people I respect delight in the “crafting” of a vapidly turned bureaucratic phrase. Again, as noted above, I serve on a committee for which we often have to craft documents and “statements.” The statements feel more like clouds, and the documents are meant to be all-encompassing and to create wiggle room. On finishing one of these phrases, the director of our committee praised us all for it. This is the kind of person who has probably spent too much time crafting mission statements for his church. As I was being praised, I felt uncomfortably like one of Sauron’s minions sending up smoke from Mordor.

Here’s the problem with all of this: It’s all around us now, and usually it’s accompanied by colorful visuals. As we simply accept more and more language that means nothing, I believe we stop expecting words, language, to actually signify. We become used to it. We accept it. There are serious moral consequences to this.

Saturday, November 3, 2012

Creative Nonfiction and the Ethics of First Person

I teach a course called Creative Nonfiction.

Most of the time, I have to explain to people, even the students in the class, what it's about. I think this happens because we live on a mental fault line. On one side, we enjoy the fruits of myth and story, though their tellers are seen as purveyors of pleasant falsehoods. The other side, inhabited by scientific discourse since the nineteenth century and joined by journalism in the twentieth, seems the more credible.

Certainly, scientific discourse shows no sign of exhausting itself. Scientists seem secure in their position as explainers of reality. But journalism, with the rise of the Internet and partisan coverage, has been in crisis. As for journalistic objectivity, I like to remember that most American newspapers began as partisan publications in the nineteenth century; most towns had a Democratic paper and a Republican one. If talk radio, the Internet, and even most of those newspapers that remain today are either tacitly or openly biased, this seems a return to journalism's roots, not a decay.

That doesn’t matter, though, because to most people the fault line exists and is important. It influences our reading of most genres, even shaping how we read the Bible. The usual terms—fiction (novels, short stories, romance and mystery) and nonfiction (science and history writing, news, instructional videos)—seem endangered as I begin to describe what goes on in my class.

Creative nonfiction, or as it is sometimes designated, narrative nonfiction, draws on fiction techniques—what we normally reserve for poetic writing—to recreate experiences drawn from memory. In using character development, dialogue, scene construction, symbol, and narrative summary to recreate our experience, we want our reader to live through what we have gone through. We also have an interest in “art,” in creating something new that hasn’t existed before.

Exploring a New Genre
I was first drawn to this genre in the nineteen-eighties as a new Christian because I realized that if I wrote a novel about the spiritual events that had occurred to me, no one would believe me. We all experience coincidences in life, but put them in a novel and they look unbelievable. But when I adopt the first-person persona of myself, whoever that might turn out to be, and stay true to what I'd written in journals and what other people remembered, I might relate a piece of reality that startled me.
Even so, to the purists, this blending of “reality” with fiction techniques sounds as though we want to embellish on our lives, make them bigger, more important than they are. To this accusation, the narrative nonfiction writer will argue that she/he wants to gain insight into what happened.
Admittedly, there are problems with this kind of writing, not the least having to do with memory. As Mira Bartok notes in her award-winning memoir The Memory Palace, writing changes what we remember. In her work about her mother’s debilitating mental illness, Bartok employs the statement “I do not remember…” often. However, it is this admission that gives what she does remember, and what she checks with others, a certain credibility.
If the aim for us is to set down in concrete a record of what happened, then this hybrid genre will seem too soft. But if, like writers from Augustine to Montaigne to Patricia Hampl and Dave Eggers, our desire is to speculate, to explore, and to discover, the attempt is worth a lifetime spent on reflection, writing, understanding.


Thursday, October 25, 2012

Feeding the Inner Caged Thing (Not Just a Halloween Blog)

I’ve been noticing something that I think is worth repeating.

What I spend my time thinking about, that’s what I seem to feed. If it is a problem with teaching, I eventually will work through it. If it is an obsession about the other candidate I'm going to vote against, or about how someone has wronged me, the perceived wrong grows. A feedback loop seems to take hold.

This feedback loop can work for good or for ill. To cite an example of the former, for the last week—week 8 of our fall semester where I teach—I’ve been trying to get back into finishing a novel I’ve been working on. During week 6 of my teaching schedule, I went a week and a half without working on it, and when I returned to it, it was cold. I felt like I was fighting a zombie. I may have been lucky enough to write 250 words that day, and it was all I could do to keep myself from getting up and doing everything, anything but writing. But then something else happened. The next day, when I had less time to write, new ideas for it came to the fore.
This got me thinking about changing my writing behaviors. So for the past two weeks, I tried something different. I put in at least some time—even time for editing—every single day on the draft. When it came time for my four hours of writing on Friday, I was very productive.
So I’ve continued this, and I’ve noticed that the novel is alive again (which may be the wrong imagery here).

I've known this for a long time, of course, that writers simply write, as swimmers simply swim. The more you write, the more you write.
This also seems a principle worth noting about mental hygiene and admonitions about what to spend my time thinking about. So right now, I’m spending less time on Facebook, less time worrying about personal slights or about all my failures, and more on the world of my novel. I may finish it soon. I’ll let you know.


Thursday, October 18, 2012

Coffee and Other Concerns at Mid-term

Today’s topic is coffee—the writer’s drink.

Coffee is the drink of the pro. It is a statement of a maturing taste, of a character learning to handle bitterness. These are, for writers, two important issues—taste and handling bitterness.
The initial move, for most, is to add sugar.

That’s what I did the first time I decided to have a cup. But that doesn’t take away from the enchanting first cup, my first cup. It just means that my lessons took a bit longer, delayed by my prolonged use of sugar. And milk.

But maybe it was having the first cup placed in front of me in the Big Boy restaurant, the jukebox going and the truckers sitting at the counter. Maybe it was following my friend’s lead and pouring in the cream from the metal container and then the three packets of sugar and stirring it with a spoon and savoring this new milky, slick independence, no longer candy, but not yet coffee. Maybe it was the night outside, hidden deeply behind the lights of the street, the early reggae song playing on our jukebox: “I can see clearly now, the rain is gone.”

I was sixteen and Nixon, seeking a second term, was getting young men out of Viet Nam, so maybe it was the strong probability of coming back again the next Friday and having it again.

This was the spot, a place of transition, a space to move forward, to begin to see my own story and where it was headed.
Today, when I see my students cramming at mid-term, it's not that I'm insensitive. I've explained the way to avoid this. And they do have the writer's drink, after all. I remember with strange fondness all those short nights in college at mid-term, with instant coffee at three in the morning, getting on with papers, reading, and exams, just as I remember those first Friday nights of thinking about the future, of getting out and drinking three or four cups before midnight.

Today, when I sit at the computer and begin a morning’s work, the coffee is there, dark roast, no milk, no sugar. I drink it all morning, and I like the taste, and it doesn’t seem all that bitter—what I used to avoid and try to sweeten. And the story continues.

 

Wednesday, October 3, 2012

The More Things Change...Rethinking the Most Recent Rumors of Literacy Crises

Most people seem to think about language the way they think about teen pregnancy. They think of it only as a problem. At least, that is how they sound.
English is in trouble, or so the argument runs. Eighth graders can’t speak or write in their first language, no one knows what a modal is, and the apostrophe is an endangered punctuation mark. (So is the comma, though its case is beyond saving.)
Add to these items, of course, the usual observations about teens and text messaging, Internet chatting, movie-going and the like, along with their lack of reading, and it would appear that we are indeed staring down the barrel of a crisis never seen before. Usually in these arguments, language study is summed up in the rules listed in Strunk and White’s The Elements of Style. In this account, there seems to be little difference between learning to use “correct” English and getting the instructions right on a new toaster.
But I don’t think that we are in a crisis of the proportions currently being imagined. Rather, the more things have changed, the more they have stayed the same.
Those objecting to the way that young people use the language today should first consider the kind of evidence that is marshaled to prove that there is a literacy crisis. People who argue that students today are poorer writers than students were in the past base their argument mostly on what appears to be nostalgia. Or sometimes they will pit today’s student papers against a novel by, say, Ernest Hemingway or Katherine Anne Porter. (This is not fair, considering both writers had editors.) In fact, this argument rests on a lack of historical knowledge. And since we are all familiar with the maxim about ignorance of history, a remedy seems in order. Our ignorance of the past means that we may be doomed to repeat it, and this is certainly borne out in the literacy crises repeated over the last 150 years.
One such crisis came in the 1870s, when the first Civil War veterans were admitted to some of the elite American colleges. Robert Connors and James Berlin both note that these students were unprepared, as were the colleges admitting them on more open terms. Some of these first-generation college students could not write in Standard English. The new openness led to new concerns about literacy and to the invention of a new college course, Freshman writing, the only general elective at Harvard.
As the United States culture continued to change, with immigrant populations, moves from rural to urban centers, and more people wanting to go to school, English departments engaged in serious debate throughout much of the 20th century on whether or not grammar should be taught and how it should be taught.
The result today is that it is possible to read some of the student papers that teachers published in journals like College English to support their arguments that student writing was getting worse and that training in grammar should be the basis for all college-level writing instruction. Those papers are interesting, for as I read them today they lead me to suspect that nothing has changed. Eighteen-year-olds in 1926 seem to have made the same kinds of sentence-level errors that eighteen-year-olds make today. Of interest is that these were the bad student writers some teacher selected to clinch an argument about teaching grammar in 1926; presumably, the papers are representative of the problems with usage, spelling, punctuation, grammar, and format typical of students at the time. The similarities between these jazz-era student papers and the papers of my students who struggle most with writing suggest that there has been no decline.
English teachers engaged in the debate about grammar and teaching writing for decades. But this debate has never been settled for the larger, general public, and when it comes to concerns with language, most of us tend to follow our initial convictions and rely on what seems the most intuitively correct evidence. This seems to be behind the assertions that poor student writing and language use is the result of TV and movie viewing, and more recently, texting.
Consider that in 1939, W. Alan Grove, an English teacher, wrote in College English that critics attributed the decline in student writing ability to “the comedians, sports commentators, and crooners of radio and movie.”
Again, it would seem that the more things change, the more they stay the same. When trouble seems to appear again, we seem ready and willing to round up all the usual suspects again. The trouble is, with these ongoing debates about a literacy crisis, we don’t seem to be aware that we are repeating the same episodes seen before. And sometimes the repetition ends in unfortunate programs like No Child Left Behind.
At the very least, it would appear that it is the literacy crises that should be questioned. All of them. We should pay attention to the comparisons we are using to make our points. Comparing the way young people use the language to adult novelists who have been intensely involved in language use for more years than the young have been alive is simply not fair. We might as well base a traffic crisis on comparing the way that 16 year olds drive to the way that their parents do.  
In language, we don’t see the unfairness because we don’t understand the developmental qualities of language acquisition. We make the positivist assumption that if material is covered in class—in this case, grammar lessons—then that is all that needs to be done.
If the question of correctness among our youngest users of the language were all there were to the story, I’d suggest we follow W. Alan Grove's suggestion and pass a law now that requires all sports announcers to at least understand the content of an upper level linguistics course. In fact, given all the "reforming" that has been done to all levels of education, all of it without attending to social forces, this might be the next step in what we call "education reform." Require anyone with a speaking role in the media to speak English as well as Jerry Seinfeld. If they can’t, they’re out of the spotlight. It would be as simple as that.
Sports announcers are generally annoying, for a number of reasons, and most of them get their jobs, not because of their command of the English language, but because they were a successful quarterback, or they are outrageously colorful, or because they help to raise TV ratings for their shows. But they will never face the regulations and upbraiding from the voting public that teachers get.
And finally, I agree that there is no reason to make them pay attention to their English.
 


Sunday, September 2, 2012

What's in an Opinion

“In my opinion,” people often argue, and then rarely stop to provide evidence.

Opinions are cheap, certainly. As the saying goes, everyone has one. Even so, what is not commonly understood about them deserves attention, especially with the double-trouble of another school year and an election season upon us.

An Opinion about Opinions
Consider the following example.

“In my opinion, President Nixon was very shrewd to abolish the military draft.”

I see the first phrase all the time in student writing. But is it necessary to tell the reader that I am offering an opinion? If I write simply, “Nixon was shrewd to abolish the military draft,” is it suddenly, magically no longer an opinion? Because there can apparently be more than one view on Nixon's motives when he did away with the draft in the early 1970s, it is pretty clear that I have to support my opinion, probably with an argument.

But I suspect that this is what many writers hope to avoid. Some students believe that they don’t have to give evidence for their opinions. Stating “this is just my opinion” seems to them, on the one hand, the equivalent of a footnote. On the other, it seems to suggest a view of knowledge that is deeply personal, based in experience, and drawn from parents, friends, and from living. Who can challenge that?

Most academics do.

Writing is Different from Speaking
Part of what is going on may have to do with confusion over the differences between speaking and writing. The conflicting advice my students often get about writing is interesting. “Just write naturally, the way you talk” and “Don’t write the way you talk” are conflicting examples. Certainly, the best stylists know how to approximate speech while remaining true to Standard Written English. They create a highly readable and engaging “conversational” style. But this takes a great deal of work and study. No writer “just” writes exactly as she speaks. A transcribed conversation is a messy, incongruous puzzle, even to the people who conducted it. They look back at it and wonder what they meant.

In matters of opinion, speech conventions and writing conventions differ widely. When talking, I can say, “But in my opinion,” and my friend, unless she is a university colleague and we are together serving on a committee, will not expect evidence to follow.

Genre Confusion
My students believe that what works in conversation—“But that’s just my opinion”—will work in writing to get them off the hook. They remind me of some of my Facebook friends who seem to think that posting slogans and cliches over and over again will eventually result in their readers having a sudden revelation of the truth of their views.

This is a very low view to hold of one’s readers: it assumes they only need to be shouted at. More likely, those readers are waiting for evidence.

In casual speech, we may not want a friend to engage in a dissertation or a harangue. “In my humble opinion” seems appropriate. But in writing, “In my opinion” is usually a wordy flag to an unsupported assertion. As a reader, I might be charmed by a writer’s opinion, especially if it is a daring one. But I also expect to be charmed by the writer’s reasons for his or her opinion.


Saturday, August 18, 2012

General Writing Advice, part 3: Thesis and Outline

This past week, I helped to teach a workshop on writing at Azusa Pacific University. For four mornings, for four hours each morning, a colleague and I met with a group of wonderful people motivated to be effective writing teachers. These teachers came from many disciplines, not just English, where, the stereotype suggests, writing should be taught. Our colleagues from across the curriculum at APU, from biology and nursing to Communication Studies and Education, joined in discussions about what was and wasn’t working for us in teaching writing in our disciplines. I was honored and encouraged by everything that went on this week.

And I was struck by a few important issues. I realized that those bits of general writing advice—never use contractions, never use the first person pronoun “I,” and some of my own pet peeves, like never use the second person “you”—were all well-known, deeply entrenched, and taught in our various fields. Less known and practiced were the principles of Composition process pedagogy—that good writing results from writing many drafts, and that writing many drafts often leads to fewer grammatical errors. In contrast, most students don’t revise, and they don’t practice invention techniques that work for them. They simply aren’t taught these techniques.

Thesis and Outline
What I discovered instead was that some general writing advice, especially the advice to have a thesis statement and write an outline before writing, is widely practiced.

This is what passes for invention: We mainly tend to focus on generating the content and organizational patterns of writing, but we ignore audience and the context of what we have to say. The word has gotten around that having or finding a thesis or a focus is standard number one for good writing. Every essay should have one. The trouble is that most of us force our students to write one—and their outline—before they have defined the problem they are interested in, and without really knowing what they are writing about.

The problem with writing an outline should be obvious enough. The outline is written before the paper. We write it and assume that it gives our essay its final structure. And then we write and discover ideas we couldn’t have anticipated before the outline stage.

This is why most published writers do a first draft—Donald Murray calls this the “zero draft.” Then we can work on it again, add our new ideas, and return to invention as needed. I shared this week that I usually don’t write an outline of a piece until I have a rough draft. Then I write an outline, which reflects back to me the shape of what I’ve written, what I’ve emphasized, over-emphasized, left out, or not developed enough. The outline then helps me rewrite and generate more prose.

From Ancient Traditions
The thesis is most interesting, though. In the ancient Roman and Greek tradition of Composition, called the progymnasmata, the thesis was often viewed as a question. Instead of the modern view of a statement, the ancients saw the thesis as a tool for exploring a problem fully.

We do not view thesis statements that way today. We do not invite students to explore, to ask questions. We don’t view writing this way either. We tend to see writing as reporting results, which should be unified around our thesis statement.

If this is so, no wonder so many students and teachers see writing as trivial.

I didn’t get that impression from my colleagues this week, though. And I look forward to the good things that are going to happen across our curriculum this fall.


Wednesday, August 8, 2012

Doubt

This week, I kept to a strict, almost religious schedule, writing every morning and avoiding interruptions until around noon. And I tried to keep the writing alive over the weekend by putting in at least an hour. In this unglamorous way, as of today, I’ve reached forty-two thousand words in a rewrite of a major project.

And this week, I started to have doubts about what I was doing. No one, after all, has asked me to do this. Who is to say it will be taken seriously by others? Will an editor find it interesting enough to publish?

These are reasonable doubts. After all, most books don’t get published. Most people don’t care about what I care about. And usually the best entertainment is not reading some three-hundred-page novel about a confused man.

But there was one other issue, at least as I’ve learned to think about it. Some doubts are reasonable, and some only seem reasonable. And this time, when I got to the root of my doubts about my writing, I realized they were rooted in a humorous exchange with friends this week. I’d been alone writing, and then the exchange happened, and then the doubts cascaded down like rocks. I realized how important humor is, and I thought my writing was probably too dry.

Doubts of a Nonbeliever
I’ve lived long enough with myself to know that when I experience doubt of any kind, I usually want to reflect for a while before I abandon ship. This is important because I experience doubts about everything I do. It’s quite amazing that I’ve ever accomplished anything that most reasonable human beings do by nature. I doubted that going to college was the right move for me. Then I doubted I would ever finish. Then I questioned graduate school. And I doubted my relationships. Somehow, I am happily married to a wonderful person, and I am a full-time college professor.

I’ve come to understand that doubts can be both very reasonable and rooted irrationally in my own insecurities. I continue to doubt the safety of standing on a railed deck twenty-four stories over downtown Boston.

Doubt/Belief
Years ago, in my twenties, I was a nihilist. That is, I believed in nothing. And yet I had occasional doubts about this nothing. Eventually, an experience caused me to begin to question some of the fundamental assumptions behind my belief in nothingness.

Now, as a Christian, I find I haven’t answered every objection my old nihilist self still raises against faith. I have many questions about suffering. I’ve come to see that holding faith as a commitment can sometimes put me at odds with reason. Doubts lead me to think and to question. Joseph Bentz, in his book Tipping Points, documents many examples of people who became Christians and continued to live with and work through their doubts.

Working through my own doubts doesn’t always result in a deeper certainty. But it does result in continued faith.

Carrying On in Good Faith/Doubt
I suspect that some of this could be applied to just about any endeavor or project anyone is involved in, from stopping world hunger to promoting peace and justice. If there is a project we've come to believe in, it is perhaps human to have questions about it. But the questions are simply a part of the work. I think my doubts about my book this week were healthy. I’d begun to do what I always do in a long project, to lose sight of the fact that I was doing something for human beings who would not just go along with every quirky point, every over-long sentence, every undocumented, unsupported assumption I hold about the universe.

There seems today far too much certainty, or, as Wayne Booth once called it, modern dogma.

I'm settling in with my doubts. Those I had this week about my book allowed me to invite a few thoughts about my readers in so that I could include them in my process.

Some might consider me bull-headed, but the journey continues.  


Wednesday, August 1, 2012

General Writing Advice, part 2

To continue the discussion I began last week about general writing advice, I offer the following narrative, which concerns two suggestions: that we need to allow ourselves to write badly, and that we need to learn to write not according to general set rules but for specific genres.

Playwriting in College
During my junior year in college, I took a playwriting workshop team-taught by a professor from theatre and the poet in residence from English. The course concerned writing for a single venue, the stage, and week after week our scenes were read and commented on.

My first two scenes were miserable failures. 

Preparing to write my third scene, I jotted vague character notes. Then, a week before it was due, I got my notes out and improved on them. I went for a walk, returned to my desk and the blank paper in front of me, and began writing the narration to the scene, which established the setting, the characters involved, and the background. Halfway through, I read it back to myself as I thought my peers would read it. And I was uncomfortable. I combined the first two sentences, and I felt a quickening, something clicking into place in the better sentence that resulted. Somehow, in that small change, I gained insight into the scene. I could, for the first time, see the end toward which my plan was leading, and I remember relaxing and thinking, I can see the whole. I know who these characters are and what they want here.

I emerged from that afternoon of writing with the sense that this was what writers had to do to be successful. They had to tinker. They had to be uncomfortable with their work enough to feel out of harmony with it, and they had to be willing to change it. The next time the class met, I found in the reading of my work that I was right. My scene seemed to communicate what I meant it to.

What this All Meant
I’ve since tried to think about this. The tinkering I did with my notes and those first sentences was perhaps not just tinkering. I suspect it was a way of getting my subconscious mind to move into and sharpen the task. And my discomfort with my first lines was not something I had ever paid attention to before; I would just go with it. Neither of these habits was in line with anything I had been taught to do. But I saw them that day as I began mulling things over.

Certainly, a larger influence on my writing that afternoon had to do with the way I was suddenly understanding the work of creating knowledge in my field, and it had to do with audience, with background, with the community I had studied with, and my deepening understanding of the particular genre I was working in. It might seem that I made these connections on my own, in isolation, but I am convinced that the community of my classroom and the mindset of my teachers also influenced me. In my solitude, the voices of this workshop formed an important influence as I was encouraged to change some fundamental behaviors. As we were required to read widely in drama and to see as many plays as we could, I began to read and think as a writer of drama—in one sense to think almost solely in terms of how a conversation really sounded, and how it was the only vehicle in a play or a script for everything else that was to unfold on a stage.

The Emergence of What Can’t be Counted
This writing experience marked the first epiphany I had about writing that involved invention. Suddenly, everything I’d heard about what it took to be a writer was taking place in me. Invention seemed to be not only a technique for generating ideas and thinking about audience. There was also a merging of identity, practice, and knowledge that came from the coaching and guiding I was getting in class.

I never found this in the creative writing magazines or books I read through later. What I saw and continue to see in the textbooks about the teaching of creative writing moves between two poles. The first concerns technique, which most agree can be taught—point of view, character development, plotting, and sometimes theme. The second is subjective, even intuitive, and remains mysterious. It certainly remained outside of the lectures and conversations of my literature professors. It had to do with an uncertain mix of a writer’s ideas, background, character, motivations, insecurities, and potential audiences, all of which are uncertain at best, and unquantifiable.

It was remarkable that in the confines of that one class, the unquantifiable and uncertain became manifest, became quite clear.  

I would be interested in hearing other stories like this, and not just about writing. I would love to hear of others who have had moments of epiphany when working on something that was challenging to them.
