In the News: Professor Pullum’s Five Points

An article written by Tom Chivers for The Telegraph, a London-based newspaper, has recently gone viral—partially, no doubt, because of Mr. Chivers’ eye-catching title: “Are ‘grammar Nazis’ ruining the English language?” His subtitle reads, “Split infinitives make them shudder and they’d never end a sentence with a preposition. But linguist Geoffrey Pullum has a message for all grammar pedants: you’re wrong.”

The article has been quoted and shared on several language websites I follow and has been linked on the linguistics and grammar subreddits on Reddit.com. The responses to the article have been quite predictable: readers are all ready to take up their metaphorical swords and shields and go to war, whether they stand for grammar by the book or an ever-evolving language.

As you read in my first post on this blog, I would much rather camp out in the middle or, better yet, not be on the battlefield at all, drawing up peace terms in an undisclosed neutral location. Although Mr. Chivers makes a clear case against “grammar Nazis” in his title, I think his interview with Professor Pullum reveals a less combative point of view I can stand behind.

Professor Geoffrey Pullum is one of the most famous linguists in the world. He co-authored The Cambridge Grammar of the English Language in 2002, a new English grammar book for the twenty-first century. He also regularly contributes to Language Log, a very popular linguistics blog. He’s currently the Professor of General Linguistics at the University of Edinburgh. Basically, he knows his stuff.

Professor Pullum vehemently argues for descriptive grammar. He makes several key points I would like to discuss.


Point One: Standard English is necessary for all speakers of the language to communicate effectively.

“’It’s entirely to the benefit of all of us that newspapers’ editorials are written in Standard English, and that we can all speak it in situations such as business and air traffic control, and understand each other.’”

My Thoughts:

Agreed. Standard English is important to preserve because it gives us all common ground. Mr. Chivers adds that Standard English is vital to success in the business world, and children should learn it as a key to a better life. While that may be important, the bigger issue here lies in the idea of a standard language. Different dialects of English develop everywhere, from urban Detroit to the Caribbean to Scotland to Australia to India—they aren’t all the same. If we don’t teach Standard English to children (as opposed to their regional dialect), we are limiting them to their own geographical location and diminishing their potential for successful interaction with other parts of the world. If English truly is a lingua franca (which, of course, is up for debate), a strong grasp of how it works and how to use it to effectively communicate ideas is absolutely necessary.


Point Two: Grammatical rules come down to personal preference, and you shouldn’t try to impose your preferences on others.

“I ask him if he has any personal dislikes, which aren’t ‘wrong’ but which annoy him. ‘I have bugbears. But I like to think I have a healthy attitude towards them: they’re my bugbears, so I regulate my own usage, not yours. For example, I’ve always disliked the term “people of colour”. I refused to use it even when I was a graduate dean working on affirmative action in the Eighties in California.’ The key, he says, is to realise that your preferences—using ‘fewer’ instead of ‘less’ when referring to plural objects, for instance—are just preferences, and claiming that they’re ‘wrong’ is false.”

My Thoughts:

This sounds nice in theory, but it seems pretty impractical to me. Everyone who uses the English language imposes preferences on others simply by using it. If I say “fewer cookies” rather than “less cookies,” I’m making it quite obvious what I prefer. It’s hard for me not to condemn “less cookies” as wrong, because—well, according to everything I know about grammar, it’s wrong. The structure of English, in this case, already makes perfect sense and there isn’t a good reason to change it, in my opinion. When speaking casually with friends, this isn’t something I would bring up, because I agree with Professor Pullum that it’s rude. However, if a friend handed me a formal letter to her boss and asked me to edit it, I would most certainly circle this mistake. If someone asks me to edit something, they want my preferences, and my preferences tend to follow the book quite closely.


Point Three: English is not the best language on the planet.

“English is the most important language on the planet, he says – not because it’s better but because, by historical accident, it happens to have spread around the globe. ‘It’s not that English has won out because of its virtue,’ he says. ‘In some ways, English is highly unsuited to its role; it has 200 irregular verbs, where Swahili, for example, has none. It would have been wonderful to have Swahili as a global language, but it didn’t happen.’”

My Thoughts:

Agreed! English is incredibly difficult for a non-native speaker to learn, and different cultural nuances make it even more difficult. I would never call English the best language, or even the most beautiful. However, English has been used (and will continue to be used) to produce some of the most historically significant documents ever written, some of the most beautiful poetry ever written, and some of the most influential literature ever written. I agree with Professor Pullum: English may not be the most ideal global language, but it has become the most important. That’s why learning about how it works and how it affects us is so important to me.


Point Four: All dialects of English are just as valid and significant as any other.

“‘Other, non-standard English dialects aren’t bad or inferior. Saying that African American vernacular English is Standard English with ignorant mistakes is as stupid as thinking that Dutch is just standard Berlin German with ignorant mistakes.’”

My Thoughts:

I’m a little torn on this one, but that goes back to my philosophy about the different functions of language. Speaking is geared towards an immediate effect, and different dialects are perfectly acceptable in that case. Many people have much less control over how they speak versus how they write. So, I agree with Professor Pullum on half of his point: non-standard dialects aren’t bad, but if they carry over into how a person formally writes, I see a problem arise. I’m in no place to condemn any dialect (I live in the American South, after all), but I still hold strong to the idea that all speakers of English can write clearly and effectively on paper.


Point Five: There is a middle ground between strictly following the rules and throwing the rules out the window.

“Whenever linguists point out that the rules of language can’t be what the ‘grammar Nazis’ think they are, people claim that they’re saying anything goes. Not at all, says Pullum. ‘We grammarians who study the English language are not all bow-tie-wearing martinets, but we’re also not flaming liberals who think everything should be allowed. There’s a sensible middle ground where you decide what the rules of Standard English are, on the basis of close study of the way that native speakers use the language.’”

My Thoughts:

I mostly agree. I like to think I stand in the middle and try to look at language objectively. However, I don’t think I’m in any position to make my own rules. So, if I get confused, I’m much more likely to consult a grammar book than carve my own path and decide for myself what to do.


I like the way Professor Pullum thinks. He casts a different light on the world of grammar, which is usually shadowed by knuckle-slappers and tsk-tsk-ers. However, I’m not quite ready to follow him into the void. Without the grammar rules I know and love, I would be out of a job.

You can learn more about Professor Pullum on his website.

Lying is Wrong, Unless It’s Right: Cleaning Up the “Lie vs. Lay” Mess

That’s my dog, Sadie, lying asleep on our guest bed. Or is she laying on the bed?

I was editing a story the other day for my friend Hanne over at Heritage Press Publications for the new National Association of Christian Women Entrepreneurs (NACWE) book, Rock Bottom is a Beautiful Place. There I was, chugging along tracking changes in bright colors on the Word document, when I ran into a puzzle. One of the writers included a sentence similar to the following in her story:

“I began feeling dizzy and couldn’t get back up when I lay down.”

It just didn’t sound right. It didn’t feel right. Shouldn’t it be “layed down,” or “laid down,” or even “lied down”? Surely “lay down” can’t be the past tense of “lie down,” right? My editor senses were tingling.

I turned to one of my favorite quick reference books, The Grouchy Grammarian: A How-Not-To Guide to the 47 Most Common Mistakes in English Made by Journalists, Broadcasters, and Others Who Should Know Better by Thomas Parrish. There it was in chapter 25: Lie vs. Lay. Mr. Parrish wasted no time proving me completely and utterly wrong. I guess I’m one of the ones who should know better.

It made me uncomfortable that this “common mistake” had floated under my radar for so long. How many occurrences had I missed while editing over the years? The grammar checker in Microsoft Word is usually pretty good about picking up on it, but it doesn’t always catch it.

For example, the following sentences are deemed acceptable by Microsoft Word:

“The artist proudly gestures to her sketches laying on the table.”

“The kindergartner lays on the floor during naptime.”

“Each soldier lied down his weapon during the peace negotiations.”

Guess what? All of those sentences are wrong! The only mistake Word seems upset about is this one:

“I layed down between the cold sheets, praying for a restful night’s sleep.”

It only notices this mistake because “layed” is not a word.
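To see why a checker catches “layed” but sails right past the other examples, here’s a toy Python sketch of my own (Word’s real checker is far more sophisticated than this, so treat it purely as an illustration). A simple dictionary lookup can flag a non-word, but it has no way of knowing whether “lays” has the direct object it needs.

```python
# Toy illustration: a dictionary-only check flags "layed" because it isn't a
# word, but every word in the "lays on the floor" sentence is legitimate, so
# nothing gets flagged even though the grammar is off.
KNOWN_WORDS = {"i", "down", "the", "kindergartner", "lays", "on", "floor",
               "during", "naptime", "lay", "laid", "lain", "lie"}

def flag_unknown_words(sentence: str) -> list[str]:
    tokens = [word.strip(".,!?'\"").lower() for word in sentence.split()]
    return [word for word in tokens if word and word not in KNOWN_WORDS]

print(flag_unknown_words("I layed down."))  # ['layed']
print(flag_unknown_words("The kindergartner lays on the floor during naptime."))  # []
```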

Confused? This is something most children learn in elementary school and promptly forget a year later. It’s a mistake high school English teachers and college professors drill into students’ heads. When students graduate and aren’t students anymore, it’s easier to pretend there isn’t a difference between lie and lay and use them interchangeably. However, if you want to be a successful writer, don’t mix these up, even when the correct form doesn’t feel right. It only feels “wrong” because people have been making the mistake for so long and so often.

Let’s start at the beginning: to lie and to lay are two completely different verbs.

Lie comes from the Old English word licgan, which means “to be situated, to remain; to be at rest, lie down.”

Lie is an action. It is something you do. It’s an intransitive verb, which means it stands alone as an action and doesn’t need a direct object.

Here is the conjugation for to lie: lie (present), lay (simple past), will lie (future), lain (past participle), lying (present participle).

Examples:

During our game of hide-and-go seek last night, I lay silently under the bed and no one found me.

I need to lie down because I’m feeling dizzy.

She had lain awake for thirty minutes before her alarm went off.

They are lying in the grass by the playground.

The phone lies motionless on the table.

Lay comes from the Old English word lecgan, which means “to place on the ground (or other surface).” Notice how similar the words licgan and lecgan are. It seems like our confusion dates back centuries.

Lay is an action done to something else. It’s a transitive verb, which means it always requires a direct object. You always lay something. Think of the verb to want. That’s a transitive verb because you can’t simply want; you have to want something.

Here’s the conjugation for to lay: lay (present), laid (simple past), will lay (future), laid (past participle), laying (present participle).

Examples:

When the exam was over, I laid down my pencil and breathed a sigh of relief.

She will lay her keys on the kitchen counter for you to find later.

Although Marcie and Joan had laid their backpacks by the back door after school, they were gone when they returned.

Please don’t walk through the kitchen; we are laying new tile and the grout is still wet.

You were laying a note on her pillow when she suddenly woke up.

Here’s an example of using both lie and lay in the same sentence:

As Paul was lying on the couch, Carrie laid his lunch on the table beside him.

As you can tell, our main problem comes with the present tense of to lay (lay) and the past tense of to lie (lay). They’re spelled the same and pronounced the same. How are we supposed to know the difference? No wonder we get confused. No wonder people change the past tense of lie (recline) to match the past tense of lie (fib). To keep them straight, it really comes down to memory.

Lie, lay, will lie, have lain.

Lay, laid, will lay, have laid.

It’s confusing, but that’s the English language for you.
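If it helps to see the two verbs side by side, here’s the same cheat sheet as a small Python dictionary. This is purely my own mnemonic device, nothing official, and the labels are just the ones I find easiest to remember.

```python
# Principal parts of the two verbs, mirroring the memory lines above:
# lie / lay / lain / lying versus lay / laid / laid / laying.
CONJUGATIONS = {
    "lie (to recline)": {"present": "lie", "past": "lay",
                         "past participle": "lain",
                         "present participle": "lying"},
    "lay (to place)":   {"present": "lay", "past": "laid",
                         "past participle": "laid",
                         "present participle": "laying"},
}

for verb, forms in CONJUGATIONS.items():
    print(f"{verb}: {forms}")
```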

As Bill Nye the Science Guy says…

Why should you care about such a trivial difference? That part is up to you. The more you care about how you write, the more professional and intelligent you will sound on paper. Little mistakes like these are obvious and add up quickly.

I’ve always believed that writing should be as polished as possible for three reasons:

  1. Writing is permanent. Once something is sent through an email or letter, posted online, or handed over to someone else, there’s no erasing it. It’s out there. With the exception, perhaps, of recordings, writing is much more permanent than speech. It can be read over and over, critiqued again and again. There’s a reason why, centuries later, we remember what Shakespeare wrote but little of what he said. In fact, we only remember famous quotes of his because someone else took the time to write them down.
  2. Writing is a representation of your mind. Do you want to make a strong impression on others? Write intelligently. Well-crafted writing demands respect.
  3. Writing presents the rare opportunity to edit yourself. When speaking, you can’t go back and say, “What I meant was…” or “This sentence would probably sound better if I said it like this…” Well, I suppose you could, but it would be very socially awkward. When writing, you can always go back and improve what you’ve said before anyone gets a chance to read it. Why not seize that opportunity?

That’s Just Not Right

Of all the words in the English language currently in use (over 175,000, according to the Oxford English Dictionary), “that” is one of the most abused. It’s a word we barely notice—a placeholder for better words, an unnecessary pause. When speaking, this isn’t very noticeable. In fact, “that” is a useful tool for clarification in verbal communication:

John: I can’t believe you said that.
Kate: I don’t think that you understand.

With the right voice inflection, “that” helps convey emotion and cuts corners by summing up implied information. Writers, on the other hand, often abuse “that,” making their work weaker and more ineffective. When editing, I always edit out “that” whenever I can, and you should, too.


All writers should be familiar with grammar mechanics. If language is a tool, you need to know how the tool works to use it to its full potential. “That” is an interesting word because it functions as five different parts of speech.

1. As a Complementizer or Subordinating Conjunction

In this case, “that” can be used to introduce a nominal clause, which substitutes for a noun or noun phrase.

Sarah demanded that her guests wipe their feet before entering her apartment.

This is closely related to its use as a subordinating conjunction, which makes one clause dependent on another clause. In this case, think of “that” as a connector.

I told her that she needed to brush her hair.

In both cases, the clause that follows “that” could stand alone as a full sentence; “that” simply attaches it to the main clause as a subordinate clause.

2. To Introduce a Restrictive Relative Clause (a clause identifying the referent of the noun it modifies; isn’t set off by commas)

In this case, “that” acts as a relative pronoun, the kind of pronoun that introduces a relative clause. (A restrictive relative clause, the kind “that” introduces, identifies the noun it modifies and isn’t set off by commas; a non-restrictive relative clause just adds extra information.) For example, read the following sentence:

A woman who serves food in a restaurant is a waitress.

In this sentence, who is the relative pronoun and who serves food in a restaurant is the restrictive relative clause.

The word “that” works just like the word “who” when used as a relative pronoun:

The dress that she bought for the gala was too big.

In this sentence, that she bought for the gala is the restrictive relative clause. If you take it out, the sentence still makes sense: The dress was too big.

3. As a Demonstrative Pronoun

Demonstrative pronouns take the place of nouns (people, places, things, or concepts) as the subject of a sentence, but, unlike regular pronouns, they are more specific. They include the following words: this, these, that, and those.

That is funny.

That was mean, Mary.

That was too long.

To test if “that” is being used as a demonstrative pronoun, try replacing the phrase “that is” with “those are.” “Those” is the plural form of “that” as a demonstrative pronoun.

That is delicious. Those are delicious.

4. As a Demonstrative Adjective

Demonstrative adjectives are different from demonstrative pronouns because, instead of standing in for a noun, they point to a specific person, place, or thing. They still include this, these, that, and those, but they are used in a different way. Let’s take a look at those sentences again:

That joke is funny.

That letter was mean, Mary.

That flight was too long.

Demonstrative adjectives have to have an accompanying noun to modify. They can’t stand alone like demonstrative pronouns. Again, the plural of “that” is “those.” You can use the same test from before.

That pie is delicious. Those pies are delicious.

5. As an Adverb (a word that modifies a verb, an adjective, or another adverb)

Usually, when “that” is used as an adverb it conveys a contradiction to an established idea.

Imagine a college student has a big test coming up and all of his friends have told him it is impossible to pass. If, when he takes it, he finds it easier than expected, he might tell his friends, “The exam wasn’t that hard.”

Or, a girl whose vacation isn’t meeting her expectations might say, “The Bahamas aren’t that great.”

In these cases, “that” acts as an adverb modifying the adjectives “hard” and “great,” telling us to what degree.

In the first two parts of speech, “that” is usually pronounced weakly, as ðət. It doesn’t receive the emphasis of the sentence. In the other three parts of speech, it is pronounced strongly, as ðæt. The phonetics show the different functions of the word. As a general rule, if “that” is pronounced weakly in a sentence, you can leave it out. If it’s pronounced strongly, it needs to stay.

After understanding how “that” functions as a word in the English language, you can begin to understand how it is commonly misused.

Three Ways “That” is Abused by Writers:

1. It is used unnecessarily.

This is the most common problem with “that.” I see it every day when editing. When you use “that” as a complementizer, a subordinating conjunction, or a relative pronoun, more often than not you can leave it out altogether and still have a well-constructed sentence.

“The complementizer that plays no role within its clause, nor does it contribute any information.” (Klammer, Schulz, and Volpe: Analyzing English Grammar, Sixth Ed.)

When you’re able to remove “that,” do it! It will usually improve the sentence dramatically.

I know that there is work to be done.
I know there is work to be done.

The tower that Connor built out of Legos was impressive.
The tower Connor built out of Legos was impressive.
[Note: You can improve this sentence even further by making Connor the subject: Connor built an impressive tower out of Legos.]

The difference between a sofa and a loveseat is that a sofa seats three people and a loveseat only seats two.
The difference between a sofa and a loveseat is a sofa seats three people and a loveseat only seats two.

He won’t come back with the same attitude that He left with.
He won’t come back with the same attitude He left with.
[Note: This sentence ends with a preposition, which isn’t formally proper grammar, though it’s a trend I see growing in writing. Formally, it should read, “He won’t come back with the same attitude with which he left.” If I were editing that sentence, I would shorten it to, “He won’t come back with the same attitude.” That avoids the formal sentence structure that seems awkward in a world of casual language.]

Above all, avoid “double thats” at all costs.

Some would say that that’s crazy.
Some would say that’s crazy.

I know that that’s the way to do it.
I know that’s the way to do it.

I don’t care that that’s your worst fear; you need to face it.
I don’t care if that’s your worst fear; you need to face it.
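Because double “thats” are so mechanical, they’re easy to scan for. Here’s a minimal Python sketch of my own (not part of any editing tool I actually use) that flags sentences containing a “that that” so you can decide which one should go:

```python
import re

# Match consecutive "that that" pairs, case-insensitively.
DOUBLE_THAT = re.compile(r"\bthat\s+that\b", re.IGNORECASE)

def flag_double_thats(text: str) -> list[str]:
    """Return each rough sentence that contains a double 'that'."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s.strip() for s in sentences if DOUBLE_THAT.search(s)]

draft = "Some would say that that's crazy. I know the way to do it."
print(flag_double_thats(draft))  # ["Some would say that that's crazy."]
```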

2. It isn’t used when necessary.

This is much rarer, but it’s still a problem. Leaving out “that” when you need it can make your writing extremely confusing. Even when “that” isn’t technically vital to the structure of a sentence, you need to use your best judgment to know if it should be included or not. Try reading the sentence out loud or having a friend read it for you.

For example, “that” could be edited out of this sentence:

The promise made to Cynthia was that she would get a new bike.
The promise made to Cynthia was she would get a new bike.

However, when “that” is removed it makes the sentence difficult to read. Is “was” referring to the promise or Cynthia? It’s best to leave “that” for clarification.

Try this sentence: Sarah demanded that her guests wipe their feet before entering her apartment.

I’ve diagrammed this for you to illustrate the optional “that.”

In this particular case, an argument could be made for keeping “that.” The phrase “demanded her guests” could be misunderstood because “demanded” has more than one meaning:

  1. The king demanded that his subjects bow before him.
  2. The warrior demanded an audience with the king.

When a word has multiple meanings, “that” can clear up harmful ambiguity. The last thing you want is for your readers to be confused; at that point, you aren’t a successful writer.

When in doubt, leave it out: Does the sentence still make sense? Does it still effectively get your point across?

3. It is used incorrectly.

This is a very common mistake, and it drives me crazy. Look at the following chart, adapted from one provided by the Online Writing Lab of Purdue University:

“That” is only used to refer to places, things, and ideas. When referring to people or a person, always use “who,” even if the person isn’t named specifically.

Incorrect: Don’t be like the procrastinator that says, “I’ll do it later.”
Correct: Don’t be like the procrastinator who says, “I’ll do it later.”

Incorrect: People that misuse “that” drive me insane.
Correct: People who misuse “that” drive me insane.

Incorrect: She’s the one that got away.
Correct: She’s the one who got away.

Don’t be confused and use who for everything. The only reason I can think of to use “who” for a thing would be to refer to a personified object or animal character in a story:

The fox, who never lost a race, was annoyed that the hare reached the mountaintop first.
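If you wanted to hunt for this mistake mechanically, a rough Python sketch might look like the one below. The list of person words is entirely hypothetical and would never be complete, so real editorial judgment still has to make the final call:

```python
import re

# A few hypothetical person-nouns an editor might watch for; "that" directly
# after one of them is usually a candidate for "who."
PERSON_WORDS = ["people", "person", "one", "someone", "procrastinator"]
PATTERN = re.compile(r"\b(" + "|".join(PERSON_WORDS) + r")\s+that\b",
                     re.IGNORECASE)

sentence = "People that misuse 'that' drive me insane."
print(PATTERN.sub(lambda m: f"{m.group(1)} who", sentence))
# People who misuse 'that' drive me insane.
```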

When we speak, overusing “that” feels more natural because we can use voice inflection that isn’t available in writing. On the other hand, if you stop to listen to yourself talk, you might find you use “that” less often than you think. Which of the following feels more natural to say out loud?

“I think that we should go to the post office first.”
“I think we should go to the post office first.”

Likewise, on paper, the second choice is much easier to read. It takes less effort to get to the point of the sentence. In this case, “that” is a hindrance.

“That” is just not right in most cases, and should be edited out when possible.

The Comma, Our Old Friend

Almost every complicated sentence in the English language uses a comma, yet this tiny punctuation mark is often misused or neglected. Many find the task of perfecting the use of commas daunting because there are so many rules to memorize and follow. Although certain rules should remain strict to maintain the fundamental structure of English, the comma can be versatile and should be allowed to change with time to reflect the current needs of English speakers.

Punctuation keeps English from becoming incomprehensible. While new words are added to the English language every day, punctuation marks have remained basically static for hundreds of years. In her book Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation, Lynne Truss describes punctuation marks as “the traffic signals of language: they tell us to slow down, notice this, take a detour, stop.” Ultimately, punctuation gives a writer the tools of a speaker: the ability to pause dramatically for the audience, raise the volume for an exclamation, or inflect correctly for a question. If punctuation marks are just as important as words for expressing ideas, then each one deserves to be thoroughly studied. In this sense, punctuation marks are some of the most important “words” in the English language.

The word “comma” comes from a Greek word meaning “a piece cut off.” The comma has caused writers trouble for centuries. Where a speaker can ramble on without a care, a writer must pause to consider the structure of a sentence: Just because I want a pause here, does a comma belong here? Will I look stupid if my sentence has no commas at all? Why do some commas have dots above them? (If you are seriously pondering that last question, I will discuss the wonders of the semicolon at some point in the future.)

Unfortunately, the comma can be confusing to use and, with the introduction of the Internet, relentlessly misused. Many writers are tempted to insert a comma wherever a reader might pause to take a breath. Some writers argue that the original use of the comma was to indicate a pause, so we should be able to use it that way now. There is significant debate in the linguistic world over several issues about the comma, but most of the controversy is trivial (Let’s be honest; aren’t most debates in the grammar world trivial? That’s what makes them so fun). The primary (and most important) rules for comma usage remain untouched.

One debate in particular arises with the use of the serial comma (or Oxford comma), which refers to the comma placed in a list directly before the ending conjunction. For example, in the following sentence, the serial comma is placed before “and”: “Today, I went to class, went to work, and saw a movie.” Many writers frown at the use of the serial comma, including grammarian Steven T. Byington. “The purpose of the comma after [the first item] is to take the place of the omitted conjunction,” he says. “Consequently it is illogical to use it also after [the second item], where the conjunction is expressed.” Others applaud it for its assistance in eliminating ambiguity.

Personally, I love the Oxford comma and never leave home without it. I suppose there is a peaceful middle ground: sometimes a sentence is improved with an Oxford comma, and sometimes it just muddles it up. You decide.
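For the programmers in the room, here’s a toy Python helper of my own (not something from any style guide) that shows exactly where the disputed comma sits. Flip the oxford flag and the only thing that changes is that one comma before the conjunction:

```python
def join_items(items, oxford=True):
    """Render a list as prose, with or without the serial (Oxford) comma."""
    if len(items) < 3:
        return " and ".join(items)
    separator = ", and " if oxford else " and "
    return ", ".join(items[:-1]) + separator + items[-1]

errands = ["went to class", "went to work", "saw a movie"]
print("Today, I " + join_items(errands) + ".")                # serial comma
print("Today, I " + join_items(errands, oxford=False) + ".")  # no serial comma
```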

This helpful infographic from Daily Infographic discusses the Oxford comma debate in a visual way:

Another common comma debate is related to the comma splice, the use of a comma to connect two independent clauses. In English, comma splices are generally regarded as grammatical errors and require editing. Microsoft Word will cram a green squiggly line all up in there. However, in her article, “A Few Good Words for the Comma Splice,” Irene Teoh Brosnahan, a proponent of descriptive grammar, defends the use of comma splices, claiming that there is a large gap between grammar handbooks and informal written English, and that the gap can be closed to bring together both styles peacefully.  She believes that the comma splice has been ignored as a legitimate use of the comma, and that comma splices are sometimes necessary to successfully convey ideas. “Even its names are tainted,” she writes. “Comma splice, comma fault, comma blunder, comma mistake.”

Of course, the comma splice is rejected even more often than the serial comma. Comma splices catch the eye because writers have been taught to find them and fix them. However, the flat rule against comma splices fails to see that, stylistically, there are instances when a comma splice is necessary to effectively communicate the meaning of the sentence. Brosnahan lists several practical uses of the comma splice, including parallel syntactical structure, lack of ambiguity, and an effect of emphasis.

Brosnahan also provides examples of unacceptable comma splices, such as “Seymour is a polite young man, as far as I know, he never even swears.” Because the syntax is not parallel, ambiguity is present, and the effect is not one of emphasis, the sentence is grammatically unacceptable. By contrast, this sentence is acceptable because the syntax is parallel: “Some will gain, others will lose.” And “School bores them, preaching bores them, even television bores them” shows that more than one clause can be connected with comma splices and still be acceptable.

Brosnahan ends her paper cleverly with an obvious yet effective comma splice: “Handbook writers should admit it, teachers should teach it, students should learn it.”  However, Lynne Truss professes a more realistic view: “So many highly respected writers adopt the splice comma that a rather unfair rule emerges on this one: only do it if you’re famous… Done knowingly by an established writer, the comma splice is effective, poetic, dashing. Done equally knowingly by people who are not published writers, it can look presumptuous.  Done ignorantly by ignorant people, it is awful.” Did you notice the comma splice she used (“effective, poetic, dashing”)? No? Then her use of the splice was successful. Although it is true that comma splices can be used emphatically and can be supported with grammar rules, they should be used consciously and sparingly to avoid scorn from readers.

Considering these issues with modern comma usage, writers are again faced with a choice between descriptive and prescriptive grammar. The constant exchange of information through technology means that our language is constantly evolving, bringing in new words, phrases, and, in the case of commas, structural rules. A language must adapt to changing times to embrace new generations that want to use it.  As users of English, it is important for us to understand that grammar rules are only in effect as long as the majority of the population accepts them. Language evolves so slowly that it is rarely noticeable when a change is introduced.

However, as I mentioned before, some rules are strict and should stay strict. Let’s look at a sentence I copied and pasted from an article online about top shows on Netflix. The article has since been edited, but I must have read it during an early draft. Here’s a sentence about the TV show “Wilfred,” starring Elijah Wood:

“Elijah Wood stars as the terribly depressed, Ryan. To make matters worse, Ryan see’s his neighbor’s dog Wilfred, as a man in a dog suit, while everyone just sees a dog. Ryan is left to watch his Wilfred, in this raunchy and irreverent comedy on Netflix.”

Ouch. Let’s ignore the misplaced apostrophe for now and focus on the commas. This writer doesn’t have a strong grasp of how the comma is formally used. Instead, he or she places a comma at every natural pause. Here’s how it should be edited:

“Elijah Wood stars as the terribly depressed, Ryan. To make matters worse, Ryan sees his neighbor’s dog Wilfred, as a man in a dog suit. Everyone else just sees a dog. Ryan is left to watch his Wilfred, in this raunchy and irreverent comedy on Netflix.” [Note: I don’t know what the writer meant to say with the underlined portion.]

Just for reference, here is how the website edited the sentence:

“Elijah Wood stars as a depressed character named Ryan. To make matters worse, Ryan sees his neighbor’s dog Wilfred as a man in a dog suit. Everyone else just sees a dog.”

A case can be made for many grammatical “errors.”  The comma is a punctuation mark that is currently undergoing a good deal of change in the English language, evolving into a more flexible and versatile mark. Prescriptive grammarians are panicking, descriptive grammarians are nodding. However, the comma has outlasted many lost archaic marks and will continue to exist as a staple in English grammar.

If you are interested in learning more about the comma, try some of these sources:

  • Brosnahan, Irene Teoh.  “A Few Good Words for the Comma Splice.”  College English 38.2 (1976): 184-188.
  • Byington, Steven T.  “Certain Fashions in Commas and Apostrophes.”  American Speech 20.1 (1945): 22-27.
  • Cannon, Garland H.  “Punctuation and Sentence Rhythm.”  College Composition and Communication 8.1 (1957): 16-22.
  • Singleton, Ralph H.  “How to Teach Punctuation.”  College English 6.2 (1944): 111-115.
  • Truss, Lynne.  Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation.  London: Profile Books, 2003.

Peculiar Punctuation: The Interrobang

Ah, the exclamation point: the joy of English punctuation. No other punctuation mark has the power to affect the volume at which we read a sentence. The exclamation point conveys an intensity that a simple period can’t achieve.

F. Scott Fitzgerald, author of The Great Gatsby (among other excellent works), wasn’t a big fan of exclamation points.

“Cut out all those exclamation points,” he said. “An exclamation point is like laughing at your own joke.”

Scott sits, silently judging you and your “!”

Now more than ever, the exclamation point is forcibly paired up with its good friend, the question mark. I’m sure you’ve seen it all over Facebook and in emails from your relatives:

“Are you kidding me?!”

“WHAT?!”

“Can you believe we’re going to Paris next summer?!”

Sometimes, people get carried away:

“OMG, THAT’S SOOOO AWESOME!!!!!!!! Can I come too?!?!?!?!”

Whew. Okay, let’s take a step back. Believe it or not, it isn’t grammatically correct to use more than one form of punctuation to end a sentence. You have three choices: a full stop (period), a question mark, and an exclamation point:

What are we supposed to do now. [incorrect]
What are we supposed to do now? [correct]
What are we supposed to do now! [incorrect]

What if you had a fourth option? Some believe the combination of a question mark and an exclamation point should be acceptable and accessible:

What are we supposed to do now‽ [correct or incorrect?]

My college Creative Writing Professor called the pairing of a question mark and exclamation point “atrocious.” He said you either need one or the other, but not both. A good writer can convey panic and surprise simultaneously without tacking on an extra punctuation mark.

Let’s allow Fitzgerald to convey his point. In his novel The Beautiful and Damned (1922), Richard Caramel meets a woman named Muriel Kane, and he eloquently describes her:

Her fingernails were too long and ornate, polished to a pink and unnatural fever. Her clothes were too tight, too stylish, too vivid, her eyes too roguish, her smile too coy. She was almost pitifully overemphasized from head to foot.

I love the language Fitzgerald uses to describe Muriel, and this is a common style for him. He never would have said, “She was almost pitifully overemphasized from head to foot!” That would’ve detracted from the subtlety of his words. In the same book, Fitzgerald demonstrates why pairing a question mark and exclamation point is ineffective through a scene between the two main characters, Anthony and Gloria:

They rejoiced happily, gay again with reborn irresponsibility. Then he told her of his opportunity to go abroad, and that he was almost ashamed to reject it.

“What do you think? Just tell me frankly.”

“Why, Anthony!” Her eyes were startled. “Do you want to go? Without me?”

His face fell—yet he knew, with his wife’s question, that it was too late. Her arms, sweet and strangling, were around him…

Fitzgerald knew better than to punctuate Gloria’s cries like this: “Do you want to go?! Without me?!” The reader knows her voice is raised in concern and she is panicked; we don’t need the exclamation point to understand that.

However, many writers don’t agree with Fitzgerald and have developed a loving relationship with both the question mark and the exclamation point.

In 1962, an advertising agent named Martin K. Speckter conceptualized a new punctuation mark: the interrobang, a fusion of a question mark (or interrogative mark) and an exclamation point (known in the printing world as a “bang”). He believed his ads would look more polished if he could use a single mark to convey the meaning of two. He introduced the concept in an article in TYPEtalks Magazine, and it caught on well enough to be included in several typewriter models and dictionaries. However, by the end of the decade, the fad fizzled out.

Nowadays, you won’t find the interrobang on any keyboard. However, it’s starting to gain new momentum in an age when the question mark and exclamation point are paired up more often than ever. Now, let’s work on those sentences from before:

“Are you kidding me‽”

“WHAT‽”

“Can you believe we’re going to Paris next summer‽”

Nevertheless, the interrobang is slightly impractical as it is. It takes a complicated keyboard shortcut to type (ALT+8253) and, unless you learn the mark at the same time you learn to write, it’s difficult to incorporate into well-established handwriting. My question marks are too curvy to accommodate a straight exclamation point, which makes writing an interrobang awkward and uncomfortable.
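If you’re wondering where the number 8253 comes from, it’s simply the interrobang’s Unicode code point (U+203D) written in decimal. Here’s a quick Python sketch of my own that shows the connection:

```python
# U+203D is the interrobang; 8253 is the same code point in decimal, which is
# where the ALT+8253 shortcut gets its number.
interrobang = chr(8253)
print(interrobang)             # ‽
print(hex(ord(interrobang)))   # 0x203d
print("Are you kidding me" + interrobang)
```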

While the mark remains a clever idea, it still isn’t considered standard punctuation and won’t be accepted in an academic environment. In casual conversation, though, feel free to impress your friends with your knowledge of this quirky mark.

Descriptive vs. Prescriptive Grammar: The Ever-Brewing Battle

There are three types of people who use the English language daily:

1. Grammar Nazis

You’ve seen them. I know you’ve heard them. Everyone has. They know the difference between they’re, their, and there backwards and forwards. You are familiar with the passive-aggressive Facebook comments that pop up when you use the wrong form of “your” in a status. They like perfect spelling, proper punctuation, and correcting people.

They are prescriptive grammarians. That sounds a little nicer than “Grammar Nazi,” huh? Start using that phrase in everyday conversation: “Stop being such a prescriptive grammarian!” 

Those who trust prescriptive grammar believe in the concrete qualities of language: syntax, vocabulary, and spelling. According to prescriptive grammar, individuals who speak a language do not have the right to change it without the consent of everyone else who speaks the language. It makes sense; a man in Ohio can’t suddenly decide that what we call “apples” should now be called “bananas.” If everyone started changing words and grammar, everything would fall into chaos because no one would understand each other. Fair enough, prescriptive grammarians. Point to you.

2. Teenagers on Twitter, and Those Who Talk “Lyk Dis”

Even though the days of character limits in text messages are long gone, they still don’t spell out “laughing out loud” or “oh my God.” They use the acronym “WTF” in front of their older relatives. They insert the word “like” into phrases that don’t contain analogies.

Example: goin’ 2 d movie, bbl. cm? cus. luvu
Translation: I’m going to the movie. I’ll be back later. Call me? See you soon. I love you.

Did that cause your heart to slightly contract in horror? Take a few deep breaths. Everything’s going to be okay.

Descriptive grammarians believe in the ever-evolving state of language. They believe that a language is developed by its users. Therefore, those who use English have the right to change it to suit their needs. Language is a tool. If a tool like a lawn mower were not accomplishing its intended purpose, an inventor would redesign it to become more useful and efficient. Descriptive grammar works the same way, and it is evident in the way today’s youth speak through technology. Their abbreviated spellings and curtailed sentences are more efficient in their fast-paced environment.

Some who strongly believe in descriptive grammar might argue that this kind of speaking is okay, and we shouldn’t do anything to stop it. They have a point, too. Read the following:

Fæder ūre þū þe eart on heofonum,
Sī þīn nama ġehālgod.
Tōbecume þīn rīċe,
ġewurþe þīn willa, on eorðan swā swā on heofonum.
Ūre ġedæġhwāmlīcan hlāf syle ūs tō dæġ,
and forġyf ūs ūre gyltas, swā swā wē forġyfað ūrum gyltendum.
And ne ġelǣd þū ūs on costnunge, ac ālȳs ūs of yfele.
Sōþlīċe.

Didn’t recognize The Lord’s Prayer? Not many would. That’s not some elvish language, either; that’s Old English. If you were a privileged Anglo-Saxon a thousand years ago, you would’ve recited that prayer every day. If it weren’t for descriptive grammar, you could read it today.

The changes descriptive grammar has made to the English language are undeniable.

3. Those Who “Could Care Less”

Yes, Number Ones, I’m aware that “could care less” is not correct. Number Twos, don’t feel bad that you didn’t notice the mistake.

Some people don’t notice how they use English, and they don’t care. They just want Number Ones and Number Twos to leave them alone.

So, who’s right? Who’s wrong? I call this an “Ever-Brewing Battle” because it feels like the first two groups are always right at the brink of a conflict, but it never comes to fruition. So, they just continue on with their lives, the Grammar Nazis muttering under their breath, the teenagers rolling their eyes.

Some languages, like Spanish, have a formal ruling body that attempts to govern how the language evolves and what changes, if any, should be made. The Real Academia Española (Royal Spanish Academy) in Madrid attempts to keep all speakers of Spanish on track and make sure they don’t blemish the language. Considering Spanish is spoken by over 400 million people across the globe, the RAE has succeeded so far in many ways. Spanish is (relatively) uniform around the world, partly due to the RAE’s efforts to preserve the language’s purity.

The RAE in Madrid

English, obviously, has no such ruling body. The effect of that shows in all English-speaking populations. Even though the United Kingdom, United States, and Australia all speak the same language, it’s not quite the same. Our sentence structure is the same, but our colloquialisms don’t match. Our English is not the same English. The funniest example I found of this during my semester in London is the word “pants.” In America, “pants” refers to the article of clothing that covers your legs. If you walk into a dry cleaning store and ask to pick up your pants that have just been cleaned, you would have no trouble. If you tried that in London, the attendant would blush and stutter while the people queuing behind you would “tut tut” because you just asked to pick up your underwear.

Prescriptive grammarians would have none of that. “Pants” would mean the same thing everywhere. New, strange slang would never be added to the dictionary.

Several words and phrases are currently making the jump from descriptive status to prescriptive status, moving from usage that dictionaries merely record to usage they officially sanction. By the end of the century, some of these grammar conflicts might be resolved:

1. Literally

This is a big one. I’ve seen calm, rational people completely snap when others misuse this word. “It’s figuratively!” Example:

#2: It was so embarrassing. I literally wanted to die!
#1: Really? Are you sure? You literally wanted dear, sweet Death to come and take you when you spilled ketchup on your shirt?

However, prescriptive grammarians are on their way to losing this fight. Merriam-Webster has already added the second definition to their American dictionary:

lit·er·al·ly

adverb \ˈli-tə-rə-lē, ˈli-trə-lē, ˈli-tər-lē\

1:  in a literal sense or manner :  actually <took the remark literally> <was literally insane>
2:  in effect :  virtually <will literally turn the world upside down to combat cruelty or injustice — Norman Cousins>

How ’bout that descriptive grammar?

2. Irregardless

This word first came into English in the early twentieth century, and is said to be a casual blend of “irrespective” and “regardless.” This is one of those words that sounds like nails on a chalkboard to Number Ones, due to the fact that it isn’t technically a word at all. It’s most often used as an introduction to a sentence, usually as a counteractive measure: “Irregardless, I still don’t think we need to make two trips to the grocery store in one day.” This word has become so common that it is slowly weaseling its way into the dictionary, too.

3. Their vs. His/Her

This is a really tough one. I run into this conundrum all the time when writing. In many languages, the third person singular pronouns are gender-neutral, or male by default. Spanish is a good example:

Un estudiante necesita estudiar sus libros.
Translation: A student needs to study his (HER?) books.

In Spanish, it’s perfectly acceptable to use the male form when referring to a group of people with different genders. In English, this used to be okay, but we are socially beginning to drift away from the “male only” mindset. Instead, many English speakers have started using the word “their” as a singular pronoun, even though it is not intended to serve as one.

This mistake is so common now, most people don’t notice it anymore. I will tell you that it is a huge pain to work around when writing. It stops me dead in my tracks every time. The phrase “his or her” is such a mouthful that I don’t like using it. I usually end up avoiding the problem altogether:

Mistake: Every citizen needs to reevaluate their political views before the next election.
Correction: Every citizen needs to reevaluate his or her political views before the next election.
Preferred Correction: All citizens need to reevaluate their political views before the next election.

Using descriptive grammar means these “mistakes” are not necessarily mistakes.

The battle simmers on, and I find myself somewhere in neutral territory, caring about the language enough to understand both sides.