10 Reasons College Students Shouldn't Plagiarize (And We're Not Even Going to Get to Ethics)

10. Free essays are very badly written. (Seriously, we're talking "F," "D" and "C-" territory.)

9. Paid-for essays are very, very badly written.

8. Paid-for essays are usually plagiarized. (Ironic, huh? Think about it: A site that will help you plagiarize is, what, devoted to lining its pockets out of a sense of decency?)

7. It is becoming easier and easier for instructors to check for plagiarism. (I have access to Turnitin.com, which can search not only the web but also printed articles and books, plus students' essays from across the country.)

6. Speaking of which—just because it took you a long time to find the essay doesn't mean it will take ME a long time to find it. Think "Google," folks. "Google."

5. You could write the darn thing in the time it takes you to plagiarize.

Whenever you cut and paste from a website, you cut and paste all the garbage from that website as well. I have spotted plagiarized papers because of all the little hyperlinks that were embedded in the original text and showed up nice and bright on a printed Word document.

I have also spotted plagiarized portions in papers because the font suddenly changed. Sometimes the font type changed, sometimes the size, and sometimes the color. Gray is not the same color as black, and many web pages come up gray when copied into Word.

In the time it takes to hunt down all the embedded links, fix the font, and smooth out the paragraphs, you could have jotted down an essay that would at least have gotten you a "C" (see #10 above).

4. You think this sounds like you? In your dreams.

I am always amused (but not enough to pass the student) when the tone of an essay suddenly shifts:
Yo, I think marijuana should be like totally legal, ya know. Since the 20th century, most countries have enacted laws affecting the legality of cannabis regarding the cultivation, use, possession, or transfer of cannabis for recreational use. Many jurisdictions have lessened the penalties for possession of small quantities of cannabis, so that it is punished by confiscation or a fine, rather than imprisonment.
Yes, the second and third sentences are from Wikipedia.

The change in tone isn't always this obvious, but one of the give-aways of online material is how much of it sounds like it was written by a committee—a committee of colorless, humorless, mind-numbing people. Okay, not always, but it often sounds mucho-professional or like an encyclopedia entry. I occasionally have students who write like this naturally, but it's uncommon.

3. Speaking of encyclopedias . . .

Another give-away is the sudden use of specifics:
Cannabis users included nineteenth-century literary figures Robert Louis Stevenson and Le Club des Hashishins members Victor Hugo and Alexandre Dumas. Eli Lilly and Company and others sold cannabis tinctures over the counter for a variety of maladies.
Okay, so maybe the student knows about Robert Louis Stevenson (he did write Dr. Jekyll and Mr. Hyde) but Le Club des Hashishins? Eli Lilly and Company? Eh, I might just feel compelled to check this paper out.

The above is ALSO from Wikipedia (do you honestly think your professors haven't heard of it?).

2. No citations anywhere.

But, you may say, maybe the above student is just using the information from Wikipedia to enhance his/her essay. That could be true, despite the fact that Wikipedia is not a trustworthy source (useful, just not trustworthy). The give-away is when (1) the words are exactly the same and (2) the student has not given credit to Wikipedia in the actual text or on the Reference page. (Really, people, it isn't that hard to put something in quotation marks and say where you found it.)

1. If you're that stupid, I should fail ya.

I think there are strong ethical reasons why plagiarism stinks. From my perspective as an English instructor, I fail plagiarized papers automatically—I can't grade/judge someone's writing if I don't have that person's writing. (I'm not sure how I would react if I were a History instructor—maybe give the student credit for some research? And then remove points for the sources being misused and lousy?)

But I also think there are lots and lots of reasons (see above) why plagiarism is just stupid. If a U.S. college student* has reached this point (college) in his or her academic career and doesn't know that copying an essay wholesale is ethically suspect, can't read my syllabus where I state that plagiarism is not allowed, can't understand the words, "Plagiarism is not allowed in this class," can't tell bad writing from good and hasn't figured out how Googling works yet, why was that student graduated from high school? I mean, if the student can't even cheat intelligently . . .

*Many students from abroad will inadvertently plagiarize since their educational background has emphasized collecting and organizing information over putting information into their own words. However, these students almost always give credit to the sources they have plundered, incorporate the sources into their own writing, and, as soon as I explain plagiarism according to MLA and APA rules, apologize profusely, asking me desperately if I intend to fail them. (No.) Which is all to say: U.S. students, don't try to use this excuse--plagiarizing information in order to "get it right" does not fall into the same category as cutting and pasting an entire document and then pretending it's yours.

If you can fool me, I'll pass you. But then, to fool me, you'd need to know something about writing to begin with.

Yeah.

© Katherine Woodbury

Teaching Millennials: Kate's Philosophy In Progress

I don't usually post about my job since this site is supposed to be about popular culture. However, college is, in its own way, becoming more and more of a marketed commodity, so why not!

I should start by saying that teaching is the most exhilarating thing I have ever done, and I care more about it than just about anything I do.

I will now add that teaching is also one of the most depressing things I have ever done. "It's a circus!" as they say in The Fugitive.

There's a karmic pay-off involved here. I get depressed because, as the saying goes, "I care." When students plagiarize, answer cell phones in class, try to manipulate me, argue that because they pay to go to college, they shouldn't be expected to attend class (this week's particularly depressing occurrence), I feel sad (as well as about 1,000 years old) because, well, yes, I care--however hokey that sounds (and perhaps no job can depress you until it matters to you).

But what do I care about? This is something I started asking myself about a year ago, and I keep chipping away at it. What exactly do I expect from students in the first place? I should state here that I don't know whether students today are worse behaved than in the past, whether colleges are expected to service a larger and less prepared generation than in the past, or whether what I deal with is just the way 20-year-olds are. (I think all three of these "whethers" are probably partially true.)

But here is what I believe about my role as a caring teacher.

It started over a year ago. It's the end of the semester. Four of my classes are winding down, one of which is a preparatory writing class. This class started with 20 students; due to attrition, we are down to approximately 17. Out of the 17, at least 10 come to every class, hand in their projects on time, and take every test. Out of the remaining 7, 4 have met the minimum requirements to pass the course. They have no complaints with me. I have no complaints with them.

The remaining 3 have caused me nothing but heartache. For example, one of these students has not met the minimum requirements to pass the course for (what appear to be) completely legitimate and unforeseen reasons. After conferencing with the student (and, believe it or not, being contacted by the student's parent), I extend mercy so long as the student meets certain criteria. The student does not meet the new criteria, instead writing me long, emotionally charged explanations. I extend mercy (and criteria) a second time with the warning that I will not be able to extend mercy a third time out of fairness to the other students. The student does not meet the second set of criteria. I drop the student from the course amidst much wailing and feelings of victimization.

So the Monday after dropping the student, I'm standing in class, and I think, "I'm so sick of this."

I wasn't just sick of dealing with people who don't take advantage of opportunities. I wasn't just sick of my own waffling and indecisiveness. I was sick of a single student occupying so much of my time, my energy, my caring. I looked at my 10 good students--the ones who came and did the work and handed things in on time--and I was suddenly sick and tired of those students not getting the attention they deserved.

They got as much of my attention in class as anyone else. But they didn't get it outside the classroom. Outside the classroom, I was spending more time fielding complaints and pleas from "well-I-know-I-should-be-there-here-are-all-my-reasons-why-I'm-not" students than worrying about how to connect with, for example, an extremely shy good student. My energy, my love of teaching was being squandered and siphoned off into what often amounted to a waste of time.

Colleges have become more and more competitive with each other over students, meaning that instructors like myself are pressured, more and more, to chase after the students who don't come to class, the ones who don't commit themselves, the ones who need to be cajoled, consoled, rallied, encouraged, and nannied into attending their classes and handing things in. I don't mind answering questions, but I've spent an inordinate amount of time dealing with personal life-crises as well as "here are all my reasons for not doing what I've been asked to do" issues. I've spent more time on students who want to hand things in late and students who want to be excused for being tardy and students who want to be given special privileges than on those students who come to class, hand things in, take the tests, and try to learn.

And I'm tired of it. Humans only have so much energy. I only have so much to spare. I'm tired of uncommitted or quasi-committed students getting my attention. I'm tired of good students being appreciated but peripheralized because teachers are expected to concentrate on getting the remaining students to CARE about their education in the first place. I'm tired of good students losing out not from getting A's and passing (of course they do) but from having the teacher's full attention, the teacher's thoughts out of the classroom as well as in it.

A year ago, I began revising my classroom approach so, more and more, what students get out of the classroom experience is their responsibility. I don't mean students do my work (correcting essays, creating tests, holding meetings, devising lesson plans, fulfilling administrative tasks). But more and more, I clarify my philosophy to my students: what it means for them to be there in the classroom; why I include certain expectations on my syllabus.

This approach may seem rather obvious. And to a degree, it is. Teaching, to a degree, is all about confidence. Teachers have to believe in something--some standard, some preferred quality within their students. The more I teach, the more (not less) I believe that a student decides exactly how educated/committed/focused he or she will be. Nobody can instill desire or action in another human being. Education, believe it or not, cannot be forced onto a person, no matter how well-intentioned the educators.

There's a scene from American Idol (several years back). A young man comes in. He can't sing. Simon tells him he isn't going on to the next level. The young man begins to plead, to explain how much it means to him, how much he cares about being on American Idol, how hard he has worked. Finally, in frustration, Simon snaps, "Oh, well, NOW you can sing."

Wanting a thing, I try to clarify for my students, is not the same as earning the thing.

On the other hand . . .

But then I think, "Is my attitude fair?" Perhaps encouraging students and teaching them what education means while helping them enter adulthood is the purpose of education. I've heard fellow adjuncts talk about how a single professor's encouragement really helped them at a rough time in their lives. I've had success myself pushing a student (gently but forcefully) to finish despite problems at home and/or illness. "You'll thank me later," I tell students à la Monk when I push them to hand things in on schedule no matter how tired or sick of school they feel.

And to be honest, a large part of my weariness is not the legitimate excuses but the expectation of entitlement, the belief that simply having the excuse is enough to make me change my standards, the course requirements, not to mention the definitions of "pass" and "on-time." Not to mention the expectation that I will accept the excuse without question. "Everybody lies," as House says to Wilson.

I feel downright warm and fuzzy when a student says to me, "I'm not going to be in class Wednesday because I'm going to stay up late to watch the baseball game" as compared to the students who tell me their grandparents died, their pets died, their sisters are getting married, etc. etc. etc. Oh, just pull those heart strings already. I recently had a student who, when I refused to say he'd met certain requirements when he didn't, told me (in order): I'm a nice person. I tried hard. This is really important to me. I only didn't do X amount of work. I live far from campus. I had to take care of an ill parent. I need this. I paid for this class. You're ruining my life. Abuse.

When I was in college, I went to class. I never asked for an extension. In four years, I called one professor once when I was throwing up all over the place. Yep, I was one of those students.

I also solved my own problems, hunted up my own answers, figured out my own grades, and thought for myself. It never occurred to me that I would go to a professor for any reason, not even to clarify an assignment (I should have; it just never occurred to me). Granted, I was living 2,000 miles away from home. But my behavior in college was not substantially different from my behavior as a senior in High School. Everything I accomplished academically was due to me reading the assignments, checking the syllabus, figuring out the answers on my own, and following through.

My students puzzle me. I try desperately to remember myself as a 20-year-old (beyond the obsessive independence which led me to move into my own apartment at age 22). "I wasn't in class last week," my students tell me, looking limp and innocent. "You had a syllabus," I say, trying not to sound like Simon. "The assignment was on the syllabus."

"I couldn't do the assignment. I don't know how to write a comparison/contrast essay," they say.

"You've got the freaking Web," I don't say. "I figured out assignments from asking other students and reading my freaking textbooks. You couldn't spend two minutes
Googling 'how to write a comparison/contrast essay?'"

I don't say it. They plagiarize enough from the Web (which makes them remarkably easy to catch). Still, where's the problem-solving? Where's the energy to figure out an assignment on one's own?

Which is when I think, Perhaps I'm right to refuse them too much pity, to refuse--no matter what the excuse--to say, "Oh, well, NOW you can sing."


Conservative Parade (and How It Undermined the Liberals in My Office)

This is an old post from 2005. I moved it up because I just linked to it from the homepage.

Last Friday, Portland, ME had a parade.

Portland has parades every time the Patriots win and also whenever the Red Sox win. So that's that parade for the next 80-odd years. (Sorry, sorry, my huge apologies to Red Sox fans; don't blow up my site or anything.)

Anyway, the parades are kind of silly; usually they consist of about three cars with the sports stars sitting on a firetruck. The crowd waits around for about five hours. The cars pass by. The crowd cheers. The sports stars go over to One City Center and give speeches. The crowd cheers. Everyone goes home. Since I work on Monument Square directly across from City Center, I can watch all this without having to expend any effort.

I will confess right now that even if the parades consisted of what I think parades should consist of, I wouldn't attend if I didn't work where I work; I'm not a parade kind of person. But I do have a definite idea of what parades are supposed to be like since I played in Junior High and High School bands up till my sophomore year.

(1) It has to be freezing.
(2) You have to stand around and freeze for hours.
(3) There have to be bands.
(4) There have to be lots and lots of cars with waving people.
(5) There have to be people twirling batons.
(6) There have to be vendors.
(7) It has to last forever.

So the whole three-cars-and-a-sports-star thing seems really weird to me.

BUT, Friday, we had a real parade. It started as a tribute to the (football) Patriots and then, since a number of our National Guard came home last month, it turned into a tribute to the Maine National Guard (you know, Patriots . . . patriots) and then, since Portland has never been terribly pro-Iraqi war, it grew to include all veterans and then to include all "heroes" (like police officers and firefighters). So we ended up with about four High School bands and a bunch of floats (such as a big Uncle Sam bird) and the Boy Scouts carrying a huge American flag and the Red Cross and a whole bunch of National Guard divisions and members of the Air Force (with a great "plane car," very low to the ground, kind of like one of those funny Shriner cars) and helicopters flying overhead and lovely Navy officers in their very sexy uniforms, some Junior High cheerleaders twirling batons, a bunch of tanks, a rock band (on the back of a truck), and finally the firetrucks with the football players. There were vendors. It was in the daytime, and it wasn't cold, but it did last way too long.

And boy, weren't the people in my office pissed.

For those of you not in the know, my office is fairly liberal. All but three of us voted for Kerry last fall, and pretty much all I listened to for about a year, starting last Spring, was how evil Bush is and how evil Bush's cabinet is and how slutty (seriously, slutty) the amazing Condoleezza Rice is and how horrible Red States are and how stupid and wrong and jerky evangelicals are and so on and so on. And so on. There were even e-mails and anti-conservative cartoons stuck up all over the office--the sort of thing that Wiccans in small towns sue over, except that isn't my style. As you can imagine, however, it hasn't exactly left me with a high opinion of liberal tolerance, reasonableness, balance, understanding or high-mindedness. And my sincere apologies to those liberals who are tolerant, reasonable, high-minded, etc. Oh, and to the amazing Barack Obama, the purple state guy. (NOTE: I still like Obama though I probably won't vote for him.)

Personally, I find it utterly impossible to speak about politics with people who are unable (and unwilling) to allow that the left has as many crazy extremists as the right and that you can find similar personality types at any point on both sides of the political spectrum. And people who admire Michael Moore are beyond me. If the guy didn't take himself seriously, I could stomach him better. But the sanctimonious "know all" attitude really turns me off. (Not to mention the adolescent "loser" sign he made at the Republican convention; liberals don't strike me as terribly mature people: politically, that is--about a lot of other things, just not politics; again, my apologies to those who are.)

Anyway, the building manager dropped off confetti, and three other women and I threw it out the window. For you Kerry supporters, the three other women were pro-Kerryites. But they didn't see that disagreeing with Bush's approach to the War on Terror meant (1) they weren't patriots and (2) they couldn't or shouldn't support the troops--the servicemen and women who have literally put their lives on the line. Even the resident protestors were out waving their little flags and giving speeches to the press.

But outside of us four confetti throwers, everyone else stayed in their offices and sulked. One guy put a French flag out his window.

Sometimes, it's like being in High School.

These are people who have been totally okay with all the demonstrations that took place on Monument Square in 2004. And even attended some of them. These are people who have been okay with all the (fairly pointless) parades we've had up till Friday's. These are people who have been energetic supporters of "the right to free speech" and still are, on paper at least. But oh my goodness, Friday's parade was just soooo tasteless, as we roll our eyes and make our girl clique gestures. Where is Michael Moore when you want to feel self-satisfied and self-righteous? (By the way, Michael Moore's sister did come to Monument Square last fall, and people from our office went to hear her speak; so did Teddy Kennedy--come to speak, that is.)

The sulkers were a bit hampered, as even the daily protestors had figured out. There are lots of people who think that the Iraq War was wrong and/or think that it could have been run better (even I think the latter), but many of those same people will draw the line at dissing or attacking servicemen and women. Some of these people remember Vietnam, and they don't want to see a repeat--where people who have agreed to die on behalf of America are treated like pariahs when they get home. We don't want to see another generation of embittered veterans, thank you very much.

I thought the parade was great! Nine months of overhearing (not willingly) conversations about the yuckiness of conservatives and Republicans and Christians, etc. etc., and all it takes is one parade to make up for it. And those aren't my rules. Apparently, according to the sulkers in my office, one conservative-style fiesta = one hundred liberal do-dahs. One conservative commentator apparently = fifty liberal ones, which is why liberals can go on ad nauseam, and with much wrath, about conservative "hate radio" (I will never believe that liberals are automatically nicer, kinder, gentler people than conservatives; if you think that, you haven't worked in my office or gone to my college). It sure gives the conservatives the weight of the odds, doesn't it?

Now, I don't usually talk about politics on this blog. I like to stick to popular culture (which, granted, politics is in its own right). Still, this was a parade, which is very much a popular culture event. However, I won't be covering these kinds of popular culture events very often since, as I indicated earlier, I don't really like parades. So I'm going to stick this post under History & Learning, even though it should really be under Festivals or something. Enjoy.


Why Your English Teacher Told You Not to Use First-Person and Why That Teacher Was Wrong

I occasionally get students who believe they should never use first-person in an essay, especially a research essay. Once upon a time, one of their teachers forbade the use of first-person, and the students took it to heart.

Since "no first-person" inevitably results in bad writing (an overabundance of passive voice; the use of "one" or "student" instead of "I"), I always tell my students, "You may use first-person in my class. In other classes, check with the instructor."

I never thought much about WHY teachers were telling students this. I vaguely remember someone telling me not to use first-person, and I vaguely remember ignoring that someone; other than that, it didn't seem like an important issue.

However, I recently discovered at least one reason teachers ban first-person: prevented from using first-person, students will set aside me-centered thinking and use credible evidence; that is, rather than saying, "I think this, thus it is true," students will write, "According to expert X . . ."

I don't buy this argument; in fact, I think banning first-person usage ends up doing more damage than good. If the problem is the lack of expert/credible sources in students' writing, not using first-person doesn't solve the problem; it just covers it up. After all, a first-person account could be more credible than an "expert's" account. I'd much rather read a student's personal/eyewitness account of 9/11 than a thousand third-person conspiracy theories.

Associating first-person with subjectivity and therefore, with poor evidence also leads to a logical fallacy:
Since first-person produces less credible/more subjective/more me-centered evidence, then non-first-person must produce more credible/more objective/less me-centered evidence.
Oh, yeah.

Since when?

There's a lot of ridiculous non-first-person evidence out there which has no more credibility than a teenage driver claiming, "I never speed." A claim, introduced with an "I" or not, is still a claim, and any claim is disputable (as is all evidence).

I've seen the results of this logical fallacy in my students' writing; they confuse claims with support, thinking any statement without "I" is evidence (there's a huge difference between arguing, "Cats make great pets" and proving that cats make great pets). They also confuse claims with facts, thinking any statement without "I" is a fact: The United States is having a recession. Newsweek says so. I can use this in my paper!

All evidence/claims are testable, both personal evidence ("I experienced") and non-personal evidence. Determining credible evidence has nothing to do with first-person and everything to do with the credibility of the speaker/researcher/study/source.

In a well-intentioned desire to prevent excessive grandstanding, teachers who ban first-person are confusing cause with effect. A superfluity of "I think that . . ." "I believe that . . ." "I must be right because . . ." may be the result of a me-centered culture (and can get annoying), but it has little to nothing to do with whether the speaker can actually be trusted or whether the speaker's evidence is meritorious. I often tell my students, "Personal evidence is the strongest evidence you have; it just isn't enough except to your parents and your friends." But to say that personal evidence carries no weight at all is such an obvious untruth that students are liable to follow the teacher's instructions while missing the point. The result is terrible critical thinkers and even worse skeptics (exchanging one mass of information--my own--for someone else's mass of information doesn't lend itself to objective reasoning).

By excising personal experience as credible evidence, students will not only not learn about evidence, they will also (bizarrely enough) learn that rules of evidence don't apply to them. Personal evidence can be judged against a rubric as much as any kind of evidence; by divorcing personal evidence from the quest for credibility, the students' "me-centeredness" has been reinforced. Hence my master's program--where a fellow student told me that all literature before 1970 is worthless because it's patriarchal, that the student's own warm and fuzzy feelings were of more objective worth than said texts, and that, furthermore, some political theoretician agreed with the student, all without the introduction of "I". Not that "I" would make it better; stupidity is still stupidity.

I don't think "me-centered" arguments in the college environment will go away until* students are forced to be intelligent (but not cynical) about information. We live in a media-saturated culture. This is good! Dismissing the not-so-great parts of that media saturation (Facebook, blogs--hee hee--the focus on the self) doesn't help anybody. Helping students recognize and assess it might.

*I don't think they will ever go away actually--college students have been "all about me" since Parisian college students in the 1300s used form letters to write home for money. Really. I'm not making that up.

Nature, Judith Rich Harris, and My Theory of Memory

I just finished Judith Rich Harris' No Two Alike, a fascinating book. I am not going to attempt to summarize all her points in this post--that's what the book is for! (Buy it! Borrow it!) I am going to respond to a very specific issue.

Here's the part I am going to summarize. Harris is trying to explain where personality variation comes from. Using controlled studies, evolutionary psychologists and behavioral geneticists (Harris herself is an amateur insofar as a person who synthesizes and clearly explains extremely complex scientific ideas can be called an amateur) have shown that genetics account for approximately 50% of personality. Somewhere between 0% and 10% of a person's personality is due to his or her homelife. The rest is anyone's guess.

At this point, I have to clarify that when Harris refers to 0-10% of a person's personality being due to his or her homelife, she is talking about one's homelife actually forming personality (a belief promoted by developmentalists). Harris believes that the remaining "rest" is due to the environment in the sense that she believes that it is due to evolutionary factors that work themselves out in the environment. I believe Harris is making the distinction between an environment forming a person and a person bringing his/her genetics and homelife (however inconsequential) to bear on an environment.

Because Harris is an "amateur" evolutionary psychologist, she looks for the answer to the "rest" in evolutionary psychology, not in poetry or philosophy. She presents the need to determine relationships (who can I trust? who can I not trust?), the need to socialize (how do I get along?) and the need for status (how can I survive by getting resources?) as the three factors that make up the "rest" of a person's personality (I'm super paraphrasing). It is in the juggling of these three factors that personality becomes differentiated.

And I more or less think she is right. This doesn't contradict my religious beliefs since I believe one of the purposes of mortality is to experience mortality, which seems redundant but does include things like evolution. (I also believe--side note--that genetics is the best defense for free will; as products of environments, we would never get the chance to form individual personalities. Our current environment would mold us instead. I also believe--double side note--that free will is a much more specific instrument than it sounds; I don't think free will means, "Creating my own personality free from outside pressures!" I think free will means, "Being or not being a total dork"--at least from a religious standpoint.)

However, believing Harris is right doesn't prevent me from understanding why she freaks out the developmentalists. (Harris wrote a book called The Nurture Assumption that apparently got the developmentalists very upset.) You think the Middle East has problems? Check out academic camps regarding nature versus nurture versus evolutionary biology. Yikes!

And Harris, unfortunately (for the developmentalists), is able to point to extremely slipshod trials and experiments run by developmentalists. But this actually brings me to my own (slight) problem with Harris. I am not an amateur evolutionary psychologist. I'm an English teacher. My slight problem arises from inside the gap between evolutionary psychology and the human experience or what humans communicate about themselves.

The developmentalists are upset because Harris states that one's homelife does not form one's personality. It isn't useful to say that parents who go out of their way to attend parenting classes produce better children who become better parents because parents who go out of their way to attend parenting classes are the parents who care about issues of parenting in the first place and will pass on said genes to their kids. (This actually happens in teaching all the time; the teachers who attend the boring required teaching courses often happen to be the best/most dedicated teachers; that doesn't mean the teaching courses are any good.)

Now, partly the developmentalists are upset because their egos are bruised and because there's a whole industry out there built on giving advice to parents to make them better parents (and what? everyone is supposed to just not care?), but I think the developmentalists are upset for another reason as well. (And I think Harris is a little dismissive here; she seems to think that parents are upset about their lack of influence for reasons that have nothing to do with the claim itself.)

Society is filled with conventional wisdom and commonsense wisdom: everybody thinks it and that just makes sense. Now, I'm not a huge fan of conventional wisdom. EVERYBODY thinks that global warming is due to pollution and that, if not stopped, the world as we know it will fall apart at the seams. Yeah, well, it's always something, isn't it?

But I am a huge fan of commonsense wisdom. That is, I do believe that human beings are some of the best people to ask if you want to know what is going on with human beings. So if people have been writing about the impact that parents have on children for thousands of years, I'm kind of going to think they probably do.

Harris argues that this connection between homelife/parenting and personality is relatively new, and she's sort of right (in terms of the obsession and the blame). Ancient Roman fathers may have worried about being good examples to their sons, but I'm not sure how much they blamed themselves if their sons turned out badly. I think they blamed the kid. Or society. In other words, the dependence on socialization in producing decent people has been higher throughout history than dependence on the family.

Except for the nagging problem that you have whole swaths of human beings who for much of history didn't socialize with anyone but their families. I'm not talking about hunters/gatherers where a large percentage of individuals were related. I mean, the pioneers (for one example), sitting out there in some cabin hundreds of miles from nowhere (Laura Ingalls Wilder, anyone?).

Now, note that Harris and I are talking about two different things. She is talking about the formation of personality and would argue, based on my understanding of her book, that in the absence of an obvious outside social structure, the child will go looking for one. It is vital to the child's growth to respond to something other than ma and pa.

I, however, am talking about influence.

But this is where I feel there is a gap in Harris' arguments. Because that influence exists and human beings talk about it all the time, both historically and contemporarily. People claim influence from their parents. It is easy to say, "Well, you say your parents taught you to be honest, but the fact is you inherited a predisposition for honesty," but when you are relying on those same people to tell you about their personalities, it seems a bit churlish ("a little weird" is my totally unscientific response).

And I think this is where the nature folks lose adherents. I don't think most people are frightened of genetic determinism (why genetic determinism would be any more threatening than environmental determinism is beyond me), but I think there is a reluctance to undermine one's own understanding of one's experience. Commonsense tells us that our personal understanding carries weight. Historical documents assure us that people have always expended energy on their own thought processes. Since I live in my own head and since my personality is (however constructed) my own, I'm hardly going to trust anyone who tells me to ignore my own reason or my own senses. Good grief, Jane Austen didn't. Why would I?

This brings us to my own theory which is the theory of memory. Actually, Harris could probably incorporate my theory of memory into one of her factors (systems), but I separate it because it brings homelife back into the game. I believe with Harris that we tell ourselves stories (explain ourselves to ourselves), but I also believe that those stories, specifically the memories we select to tell ourselves those stories, have tremendous weight. I believe people start creating memory stories (these memories describe me and my experience) as soon as their reasoning skills develop adequately. I also believe that people are drawn to creating a memory story exclusive to their homelife: holidays, vacations, family dinners or lack thereof.

Commonsense (and cognitive learning theories) state that the pathways formed by the selection and repetition of certain memories have something to do with how a person operates. We can change our stories of course, but I find it almost impossible to believe (in terms of commonsense) that a homelife that supplied few positive memories would result in a hugely positive homelife memory story. Gotta work with something. And I also find it difficult to believe that a positive or negative homelife memory story won't influence me in terms of my choices. (And I believe that while personality may not be the result of choice but rather the determination that leads to choice, looking at choice is really the only way personality CAN be determined. You can look at a DNA strand or brain scan, but you have to put the owner of the DNA/brain in motion to determine anything about the relationship of the strand or scan to the owner. Otherwise, evolutionary psychology falls into the category of "really, really boring.")

Here is where Harris and I would agree (I think): the homelife doesn't determine what memory story I create or even how much time I focus on creating and repeating a memory story ("time spent" could be genetic). Homelife simply supplies evidence. Other factors will determine what/how I create the memory story but once created, it does carry influence, and that influence impacts my choices outside the home.

Now, it is possible that my desire to create a memory story based on my homelife is a result of experiences outside the home: I come from cultures (American and Mormon) that place a high premium on what happened in my childhood home. But here's where Harris can't have it both ways. If socialization is a factor in our personalities, and if we come from societies which place a high premium on our homelife/experiences with our parents (and most of us do), then we have been socialized to take our homelife seriously which is going to impact our personalities.

And we may not even know it, according to Harris who argues that socialization is largely unconscious. We adopt the patterns of speech and behaviors of our culture in order to conform/operate. This isn't a bad thing; it's survival. If we didn't, we couldn't communicate (and I believe Harris is right that operating successfully as a social animal comes down to communication). However, an invisible force leaves room for other invisible forces. I doubt very much that overt discipline in the home greatly alters a child's personality (unless, as Harris points out, such discipline is relentlessly severe), but I do believe that the influence of memory (what happened to me yesterday; what my parents did this morning and the morning before that; what I tell myself about what happened) does. It just doesn't make any (common)sense that it wouldn't.

I should end by reminding the reader that I am more on Harris' side than opposed to her. I recently took a (very boring) education class where the material instructed me that girls and boys don't always learn information the same way. The material also instructed me not to be sexist and not to harass the boys--oh, wait, that was a different class--and frankly, what the material had to say about male/female learning differences was pretty shallow. But still, I was pleased to know that one is (finally) allowed to say that genes and biology make a difference. It's about time!

Education Course

I took an education course this summer that I recently, as in just, finished (I am hoping to get certified sometime in, oh, the next twenty years). Education courses are often vilified as boring, stupid, wasteful, and unimaginative. I will admit that the main textbook for the course was mind-numbing, and I do not think I would have benefited from taking the course on-campus. However, I took the course on-line. The professor was excellent, and the assignments were extremely effective. I have made several changes to my lesson plans based on what I learned. And I learned things I didn't know, or at least things I wouldn't have bothered to find out about if I hadn't been forced (like rubrics).

I still have a number of reservations about the philosophies behind the course, which I talk about below. My reservations can be summed up by a line from Diana Trilling (written in 1981!): "[T]he young are only as virtuous as they grow up to be and . . . the educational process doesn't improve their future prospects by flattering their present moral capacities." Add "writing and grammar" after "moral" and that's exactly what I believe.

My research paper for the course can be found at Papers. It deals with language transfer issues in grammar courses.

Now I'm ready to teach in the fall!

**********************************************

"Educating the Exceptional Student in the Classroom" has been an insightful class. I still have mixed feelings about the philosophies of inclusion, accommodation and differentiated instruction; however, I find the processes embedded in these philosophies very helpful.

As I stated in my first reflective paper, I admire the principle of inclusion. Through this course, I have realized that inclusion does not mean lowering standards or giving students a free ride, but I still wonder if the demands of inclusion may lead to lowered standards. Teachers will spread themselves too thin. In an effort to meet everyone's needs, fewer and fewer demands will be made. As I suggested in my first reflective paper, collaboration can ease the burden of multiple needs, but collaboration itself involves extra work and (often unavailable) time. Without a structured plan (which, to be fair, the IEP requires), accommodations, rather than being temporary aids, can become barriers—engrained into the classroom culture and anticipated/expected by students.

My worries stem from my experience as a college adjunct. I am often unsettled by how much accommodation my students expect and how unready they are for working life or, even, academic application. When I, following procedure, release them from certain obligations, I wonder, "Am I really helping?" Or, rather, am I making the transition to "the real world" that much harder? In the "real world," grammar mistakes turn off potential employers; deadlines must be met; continual absences result in being fired; fellow employees don't fill in the blanks or do our work for us.

I'm harking back to my initial reservations regarding IDEA. Where is the line between assisting someone and preventing that person from individual growth and personal understanding? In a conference for English teachers, one professor remarked that some of her most fruitful learning experiences occurred when she failed. How, she wanted to know, can we give students similar experiences? (And where is the line between forcing kids to fail and allowing them, for their own good, to fail?)

Again, I do not believe that the philosophies of inclusion and accommodation automatically prevent learning and growth. Scaffolding, or modeling, provides a student with completed steps that are removed as the student progresses. If I want to teach students how to recognize and correct run-on sentences, I should first "model" a run-on sentence and show how it can be corrected. Eventually, the students will no longer need the model. That is the ideal.

Students suffer if the scaffolding is never removed. Likewise, students suffer if they are never required to process information outside their learning styles/comfort zones. Fewer and fewer of my students have been drilled in grammar (the older ones, yes; the younger ones, no), such as sentence diagramming. As I discuss in my research paper, grammar drills are not (necessarily) the best way to learn a language, but, in the absence of other approaches, they are better than nothing! I wonder if the current emphasis on context (provide a reason and application for every skill) has produced students without any groundwork in basic skills (and consequently an inability to build on those skills). Is my students' lack of readiness typical of twenty-year-olds or is it the result of inclusion, accommodation, and differentiated instruction applied inaccurately and/or hurriedly by overworked teachers? I don't know the answer.

Despite my reservations, I admire the intentions of inclusion, accommodation, and/or differentiated instruction; not all kids learn the same or at the same rate. Teaching is a creative process that involves constant re-evaluation: Is this lesson working? Could it be better? Who will understand it? How many students will it reach? Ultimately the teacher's job is to communicate, not simply to present information and hope the students got it. From this perspective, "Educating the Exceptional Student in the Classroom" has been helpful and enlightening, not to say engrossing! I particularly enjoyed the assignments that involved problem-solving. Analyzing extant lesson plans forced me to imagine a classroom of multiple responses and abilities. Creating my own lesson plans forced me to tweak old ideas, to examine how a lesson flows and where I can involve students more. Professor Soderstrom has also modeled approaches that I find useful, such as the outcome rubrics. I have always found it difficult to communicate progression and achievement to my students. The rubric is a possible solution.

The lesson plans also provided a useful model. I now use the same layout for my grammar and composition lessons. The "Pre-requisite" section helps me pinpoint exactly what each lesson should build on. If I haven't covered the pre-requisites, perhaps I should alter the lesson plan! I also enjoyed creating the WebQuest. I devised an interactive document that I hope to use when I teach on-line this fall. Again, the activity involved problem-solving: Can students follow the quest? Understand the main points? Are the websites accessible? Readable? Usable?

In the final analysis, I consider usability the key factor in teaching. Usability is also the aim of inclusion, accommodation, and differentiated instruction. Does a lesson enable the student? Does the student have the necessary skills to succeed? In the past two years, I have seen a welcome change in English composition courses in terms of usability. Rather than focusing on output and literary writing, the focus is now on portfolios, revision, and clear communication. Today's employers don't always require literary analysis, but they do require professional prose.

From this angle, I am entirely in agreement with the philosophies covered in this course. I only hope that I and other teachers can employ the philosophies responsibly.


The Wonder of Proof (thank you white males)

CASE 1: When David Irving brought his libel suit against Deborah Lipstadt, Deborah Lipstadt's lawyers hired the historian Richard J. Evans to examine David Irving's history books about Hitler and Dresden and World War II. In his examination, Evans and his assistants discovered that Irving had consistently misread, misinterpreted, misquoted and deliberately obscured the documentation upon which his books were supposedly based. Evans was able to do this because he traced back the many, many footnotes in Irving's books to the written documentation (letters, municipal documents) itself.

CASE 2: The Salem Witch trials were ended by the same culture in which they began. As the trials went on and on and more and more people were accused and hanged, Puritan ministers and lawyers began to speak their doubts. They believed in witches, but they weren't too keen on the methods by which witches were accused--namely, oral testimony. Where's the proof? they said. Where's the evidence? Eventually, these skeptics were heard, a new set of judges was appointed, and the trials ended.

I give these two cases to illustrate the importance of the academic, scholarly, educated concept of proof and secondly, the importance of written documentation. This is an ongoing argument in my master's program. This week, we read Henry Glassie. Henry Glassie has some interesting things to say but, as I foresaw, the statement that everyone fell on was not Glassie's call for historians to be vigorous, ongoing and dynamic in their scholarship but his statement that, heaven help us, histories are "either all history or all folk histories." That is, both academic history and oral history (or folk/myth history) string together narratives from the facts available. Ergo, they are the same, should be treated the same, and given the same weight.

I and one other student argued against this. I am not disputing that history (as it is told and taught) involves a narrative. Neither will I dispute, as one student pointed out, that oral histories (like academic histories) are both trying to be truthful, that both have checking systems, that both come out of particular cultures, that both change over time. Neither will I dispute that academic, written history--thank goodness--has more weight or power than oral history. What I did dispute was the expectations or process to which either is submitted. What I could not communicate (which is, frankly, why I'm trying to do it here) is that confusing the two does not, in the long term, aid either one.

If all histories are one, then Irving's oral testimony of his own good intent bore as much weight as, if not more than, his misuse of written documentation. The historians in that trial were only able to undermine Irving's claims because they could access written documentation that could be disputed, argued over, debated, re-interpreted and checked. This is almost impossible to do with oral testimony. One student argued that students are taught to take anything written as automatically true. I have a difficult time replying to people who say things like this, because I am afraid I will say something nasty like, "Well, stupid people do." Stupid people also believe anything that is said to them, simply because it is said. Good historians question what is written, who wrote it and why, just as any of us do about an oral tale. That doesn't undermine the superiority (historically) of a written text, which can be brought forward and studied, over oral testimony, which morphs, through time, beyond recognition (the other student actually made this argument).

The fact is, if all histories are one, then the oral testimony of the accusers in the Salem Witch trials had as much bearing as the requirement of proof. Vice versa--which no one seemed to realize--if all histories are one, then I have as much right to demand empirical proof from folk histories as I do from academic histories. If the standard is the same, then oral folk histories (which have, in general, a different purpose and standard from academic histories, despite the occasional similarities) could be submitted to the same requirement of proof. I mention this because it is the latter demand (that folk histories submit themselves to rigorous scholarly approval) that upsets pro-oral history folks. But, by insisting that the one is as historically valid as the other, they are causing this confusion to take place. I am not declaring that oral histories and folktales are unimportant; I am myself a big fan of folktales, myth and faith-based theologies. But I don't demand, unlike the creationists, that faith-based theologies be submitted to the same standards of empirical and historical proof as science and history and, even, critical theory. In my mind, this preserves both from degradation. I also believe that the importance of empirical and historical proof can never be overstated. We thrive as a culture because of it. Western civilization is very flawed and makes many mistakes; but it also progresses, improves and fixes itself due to the demand that one must have evidence, proof, written or artifact, that other people can see and discuss.

Otherwise, in the end, the only people who will win are the people whose political side (whether academic or oral) happens to be winning. (And frankly, I sometimes think that is all the relativists want: to win.) For instance, the requirement of proof (in America) arises out of a white, patriarchal, academic cultural mindset, and that seems to be enough reason to despise it. Our class on Tuesday was filled with statements like, "All histories are cultural constructions." When I put forward the idea that yes, alright, every historian speaks out of a particular frame of mind so we need to study written histories over time, I was informed that past histories were written by white males. The implication was that such histories are too narrow to be of any worth.

If I thought this cavalier attitude towards proof/the past/written documentation and knowledge was in the ascendant, I would get really depressed. But I don't think it is. I've noticed in younger scholars a vague contempt for the relativism and self-importance of this "we can't really know anything" attitude. They know they can't be completely objective, but they aren't willing to abandon the idea of standards or of a canon--the idea that some books are better than others, that some histories are more accurate than others. They aren't willing to throw all power into the hands of the "nobody knows the truth" relativists, who want to dismiss the burden of proof as male, white, patriarchal and power-hungry. As Cynthia Eller points out in her book The Myth of Matriarchal Prehistory, feminists hurt themselves more than male culture when they promote an imaginary past not as lore (which would be acceptable by the standards of lore) but as a history (which is not acceptable by the standards of academe) that ought to be believed in as history. If that is the best feminists have to offer, and if they continue to insist that "good" feminists ought to accept this kind of relativist tyranny, I'll stick with patriarchy, thank you very much.


My Theory About the Purpose of Higher Education

I've been thinking about this a lot lately. If you read this blog, you know that I often complain about my college. However, I'm not opposed to the existence of higher education; I just think it has gone haywire. (Or never had what I wanted--also possible.) So I'm going to try to put down here what I do think the purpose is supposed to be.

First of all, I don't think it is supposed to be pompous (think Harold Bloom), either loaded down with academic verbiage or devoted only to the study of particular kinds of culture. On the other hand, I think the academic world is based on a belief: that it is possible to teach stuff over time, and that stuff will still be there every time you teach it. And instead of apologizing all over the place for this...this...dare we say, conservatism, the academic world should just say, "Look, this is what we do. We teach that there are things that can be taught."

For example, in my current class on folklore, we have read a number of articles and had a number of interesting discussions, but the professor hasn't taught us any of the theories of folklore that have influenced scholarly opinions.

Now, as you may know, I'm not a big believer in using hypothetical theories to find hypothetical profundities, and if that was the professor's feeling as well, I probably wouldn't care, but it isn't; we spend many classes having hypothetical discussions about hypothetical profundities. If we're going to do this, I think we should know the scholarship that makes it possible. (Then we can decide whether the scholars are right or not.)

So I think the academic world shouldn't apologize for being scholarly. It shouldn't be obnoxious and ivory tower-ish about it either. But it shouldn't apologize. The academic world should accept that it has a purpose. It might not be a particularly noble or relevant purpose but who cares. Its purpose is to teach scholarship, teach what people have written about what other people have written and done through time.

In promoting this, I feel that I'm some kind of throwback to medievalism. Or maybe I just read WAY too much C.S. Lewis as a kid. (Probably.) Lewis was a big fan of the narrow and deep approach. In my case, I feel that the Humanities (in particular) went off the rails when they thought they had to come up with teaching styles that covered economics and race and gender. And then they decided they had to be diverse, which meant they had to be sorry that so many of their texts were written by men and were written, not oral. Shock. Shock. I'm not opposed to reading things by women, nor am I opposed to reading/studying oral cultures. But I don't think the academic world should abandon what it is in order to show how sorry it is that it isn't something else. Because if it keeps doing that, it's just going to vanish.

To my own subject: I think when literature is taught, it ought to be taught for/as what it is. I mean, nobody expects Stephen Hawking to explain the particular importance of his universe model in terms of the economic realities of minority single mothers living in urban centers. (And his theories might be amazing, but they have very little relevance to the day-to-day.) As a teacher myself, I'm not a big fan of enlightening people or engendering appreciation or whatever. I just want to teach people to understand a subject, and I think it would be fantastic if the Humanities could reach the point where people would talk about, say, Homer, just because he's dead, and he wrote great poetry, and here's how he did it, and here's why people cared, and people may not care anymore, but they should still know that once someone did care--just because it's good to know stuff like that about books, like it is good to know about Newton and Galileo. Not even because it makes us better people. Just 'cause knowing stuff is good. (I'm going to print that on a T-shirt: Knowing Stuff is Good.)

But maybe I'm expecting too much. Is it better to have people vaguely educated in a lot of things with a lot of vague concepts rattling around in their heads or is it better to have people know a lot of information and context and specific details about a very few things?

Actually, I don't know the answer to that.

Friday

0 comments

Welcome to Academe

It started when the professor asked us if there is a difference between history and lore; aren't they really just the same thing?

Okay, I'm game. I spoke up and said that history, or the goal of history, works within the principle of empirical evidence while lore is an attempt to get at the emotional truth or reality of an event, which evidence can't always discover. I didn't make that last bit up. It was in our reading for Tuesday's class, which I cited in class.

At which point, a woman I have had several classes with in the course of the program spoke up. I'm going to call her Traci. Traci declared that it really bugged her that history is perceived as true while lore is perceived as false (myth); history, she told us, is male and patriarchal and hierarchical and part of the dominant narrative, and it's all b.s. She wasn't really responding to me, but I responded nevertheless. I pointed out that her statement was a little over-generalized; I'm a woman, and I happen to think empirical evidence is swell (or words to that effect).

She replied by saying that she wasn't talking about my relationship to history, she was talking about history as it has come down to us (through the schools): "the way we're trained to think," she said.

"Doesn't bother me," I said. At which point, another student pointed out that lore isn't exactly non-patriarchal since many of the stories that are passed down through folk culture are male-oriented and male-told. Traci retreated to the position of "well, it's all about social class" (everyone in this college, except me, eventually retreats to the "it's all about social class" position).

I suppose anyone who has been in a college course where this topic has been discussed knows where this is headed. I knew where it was headed, although I figured the professor would push it off track before it got there. He didn't.

In any case, since I had made myself the object of attention, he directed his next few questions at me. Basically, he was arguing the concept that people live in their own heads: what they perceive as real is true to them. There is a great deal of validity to this idea, but I wasn't prepared to jump on the bandwagon whole-hog, considering the underlying assumptions. I especially wasn't willing to accept that history and lore are just two variations of the same thing--the creation of a narrative. (I realize they can be, but I was thinking of history as it had been defined in the class: male, empirical, academic, factual, etc.)

The professor brought up that book A Million Little Pieces, or whatever it is called. If the guy who wrote the book really believed that those things happened to him, and the book is based on his memory, can we really get into a brouhaha over his "reality"?

"Yes!" I said.

Now, I honestly don't care about the book or the brouhaha, but I said, "The guy went on shows and people believed him and those 'facts' were out there, and we have a responsibility to the truth, to be accurate about things."

Anyway, at this point it was only a matter of time. Sure enough, we hit the biggest issue of them all: the Holocaust. If all reality is relative, and if Holocaust deniers really believe that the Holocaust didn't happen, doesn't that mean that, for them, that "reality" is true?

Which is why, as I tried to point out way at the beginning of the class, you don't confuse history, or the pursuit of history, no matter how male and patriarchal, with lore, or the pursuit of lore.

"After all," said one student, "does it really matter if it was 1 million or 5 million people who died?"

Now, before I continue, I should state that I do not think the student was making a deliberate anti-Semitic or denialist statement. I think he has no idea that most Holocaust denial centers around reducing the number of Jews who died in concentration camps in Europe. He was (simply) responding to the "it's all about the emotion of the event" relativism that was being promoted (by the professor) in the class.

And I went nuts. I said, "Yes, it matters! It matters to the people who died."

"But," he said, "does it matter in terms of the stories that are collected? The stories don't have anything to do with statistics. What's the difference between 5 million and 1 million?"

Now, notice that the business of statistics (empirical history) and lore (emotional content) has gotten conflated. Have I mentioned that the two are distinct and should be dealt with as distinct?

I said, "A lot. It matters. It matters to the people who died; it matters to the families who were affected. It matters what battles were fought in what towns, and what people disappeared from where. Statisticians know--they can figure out--that 5.8 to 6 million Jews disappeared from Europe. And that matters." At which point, I got a tad more sarcastic about the flightiness of relativity.

"You keep saying it matters," he said, "but you can't say why."

And I just gaped. I'd given him some fairly solid emotional reasons. Do I actually have to defend the need for historical veracity?

Another student said, "It matters in terms of the stories that are lost," and then went on to say, "If the number isn't kept consistent, then it changes every time a story is told, and eventually that truth has been lost."

"Look," I broke in, "there is the lore, and then there is the statistical evidence, and I'm willing to say that as far as the lore is concerned, the statistical evidence isn't important," at which point another student sighed as if, well, duh, finally the whacked out chick has gotten on board. I could have hit him. I continued, "But you can't take lore and apply it to evidence. You set it aside. You don't confuse the two. You don't let lore determine evidence." Except I was a lot angrier and hostile than it sounds reading it.

And then, Traci piped up again. I think Traci was a bit dismayed by the direction the conversation had taken. History, according to Traci, isn't all b.s. anymore, but the statisticians miss things, and, she continued, looking at me, she thinks the academic world doesn't take the emotional/lore side of things seriously enough.

I just stared at her. I was so angry, I think I could have chewed concrete at that point. I wasn't the one who decided history was evil (leading to a stupid conversation in which historical facts and relativistic lore were constantly confused); I wasn't the one who made the system oppositional and binary. I never said anything that disagreed with her final statement. As the student next to me said, "We want to keep the baby and the bathwater." (Another student said, "The two [history and lore] have different functions," which I also went along with.)

I kept a very low profile for the rest of class so I wouldn't throw staplers or anything at people, and I realized, actually, yes, Traci and I do disagree. Because when Traci says, "Academe ignores the way people in history are feeling," she doesn't mean, as I do, that the academic world has a tendency to be extremely literal and label-happy. Traci is a literal, label-happy person who constantly makes negative, stereotyped and generalized statements about Christianity, history, people she disagrees with and, for that matter, stuff she does agree with, like paganism (which she romanticizes as female and environmentally friendly). When Traci says, "Academe ignores the way people are feeling," she means, "Academe ignores the lack of power that people have." Or the excessive power that people have. Or how unfair everything in history is.

And I felt, in a very literal way, sick. I honestly can't say whether, if I hadn't been there (and believe me, I wish I hadn't been), anyone would have bothered to argue with the professor's initial point (which was more of a devil's advocate position than anything else). Despite the few students who backed me up, the general attitude was that I was making a fuss out of nothing. Being historically accurate is all very well, but we here in this Master's program are more interested in saying things like, "It's all in people's heads," and bringing up the concept of ideologies: can we blame it on race, class, gender, power? If I had been absent, I think that, with maybe one or two exceptions, the professor's "aren't history and lore just variations on a theme?" idea would have been accepted as a truism.

I don't think this is a minor issue, although it is obviously a pointless battle for me to fight. I realize, in retrospect, that a great deal of confusion was caused by language: history (as an empirical study) versus history (as learned narratives) versus lore (as rumor and hearsay) versus lore (as constructed narratives) versus lore (as experienced realities). However, I was fairly clear that I was referring to history as an empirical study, and the class in general commonly mixed history as an empirical study (statistics, numbers, verifiable data) with the remaining definitions. Have I mentioned that history shouldn't be confused with . . . oh, never mind. In any case, the difference, as far as I can tell, between me and the general consensus is that I think history (study or narrative) is held to a (male, patriarchal and, by the way, excellent) standard that lore is not. (At one point, when Traci was getting upset about history as it is taught, I said, "But how could you teach lore?" meaning not the study of lore, but the emotional resonance of any subject. Especially since it is hard enough to get kids to understand that the Civil War happened before WWI.) I actually think lore should be held to a standard as well; to me, the emotional truth of a subject isn't the label (see below) but how well it sits with the evidence. It's all very well for Traci to say that paganism was sweet and matriarchal and environmentally gracious, but that doesn't really fit with what we know about the Roman Empire, the Greeks, Syrians, Babylonians, Celts, Huns, Vikings, Egyptians . . . However, I wasn't going to go down that path; my acceptance of lore as non-empirical and non-verifiable was my concession to the argument.

In any case, it seems to me that everyone else was far more interested in discussing things in terms of theory, in which case it doesn't matter, to them at least, whether you mix statistics with relativism. Label, label, label, who has got the label. As I have stated before, and I will say again, discussions in the academic world about class, race, gender and power are overrated, oversimplified, superficial and unbalanced. Theory and label are all very well, but if you don't accept the evidence, how can you argue about things you know nothing about? And why are you bothering? Make up your own story, already. (I was, I have to admit, really very annoying; at one point, when the teacher was talking about how scholars try to figure out why people tell the stories that they do, I said, "So scholars just throw a bunch of meanings at a folktale and hope one of them sticks?" He didn't really like that, although Traci laughed.)

It's not like this experience is new to me. I've been in four classes with Traci, for instance, and she says the same kinds of things in every class, things that to me seem outrageously bigoted. But since empirical history is evil and male and patriarchal, she doesn't have to answer for her prejudices (which I honestly believe she thinks she doesn't have). Since she's anti-academe, she finds my anti-academe comments amusing, but knowledge, for Traci, is all about getting power back (from evil, white males). It isn't about truth: factual or emotional. She doesn't even seem to care what people actually experience since she was ready to dismiss my non-angry experience with male academic culture; what she seems to care about is whether she can label what people are experiencing in terms of power or victimhood.

I don't think that most students go as far as Traci, but Tuesday's class made it clear to me that I am even more alone in this higher education experience than I had imagined. Because even if the students are, in general, more tempered than Traci, they still support the basic, underlying assumptions that frame her thinking.

Welcome to Academe.