The Made-Up Rules of Traditional “Grammar”
Contents
- Introduction
- Double negatives
- Negative plus plural
- Singular noun plus plural
- Split infinitives
- Shall / will
- ly-less adverbs
- Final prepositions
- More / most + -er / -est
- No increasing of superlatives
- Less / fewer
- He / they
- Who / whom
- Subject pronouns in weird places
- That / which
- Possessive + gerund
- Dangling modifiers
- Words which traditionalists think are wrong
Introduction
Although the traditionalists are always very stern and certain, there is no justification for any of their rules. That’s because those know-alls actually know nothing about their subject. The traditional “grammar” school began in wrong-headed ignorance in the mid-eighteenth century and has made, remarkably, no intellectual progress since. They still think that there is one correct version of the language, an ideal which even middle-class standard speakers fall short of and which is known only to the chap writing the book, the Queen and maybe the Head of Gerunds at the BBC. Their only contribution has been to create confusion by issuing not just strong and wrong advice but stern and wrong orders. They have latched on to some rules either in an attempt to cleanse standard of any infection from non-standard (like the ban on double negatives) or to impose their own bizarre hang-ups, like the Victorian clergyman Henry Alford declaring a ban on split infinitives in 1864. Because their rules are made-up, it is sometimes possible, as with the Rev Alford, to pinpoint the individual who did the inventing: another example is the ban on prepositions at the end of sentences, a rule that the seventeenth-century poet John Dryden came up with.
Double negatives
“I didn’t do nothing.” “He didn’t see nobody.” “She’d never do nothing like that to no one, not never.” The double negative – or, as in the last example, the multiple negative – is one of the most loathed of all the traditional “grammar” school’s pet hates.
It’s not just sticklers who write “grammar” books who hate double negatives – so do a lot of middle-class standard speakers. In some surveys, the double negative is the most despised of all non-standard, working-class features, seen as a symptom of real thickness.
The only reason for the loathing is that this is a rule where non-standard, working-class grammar differs very noticeably from middle-class standard. All non-standard dialects of English encourage double – or rather, multiple – negation, whereas standard English, always the odd one out, is the only one to opt for a single-negative system.
According to the traditionalists, it’s a question of simple arithmetic – two negatives make a positive. If you don’t do nothing, that means you do do something. Well, no, because language isn’t arithmetic, and in language, repetition usually means emphasis, as it does with English’s double, or multiple, negatives. In any case, the clarity and arithmetic arguments break down with multiple negatives – “I never saw nobody do nothing”: by their own arithmetic, shouldn’t that be fine, given that three negatives make a negative? Arithmetic, shmarithmetic.
Double and multiple negatives are every bit as grammatical in our non-standard dialects as they are in many other languages – French, where negatives are based on the double “ne . . . pas” structure, or classical Greek, which also doubled its negatives. In some languages, like Polish, a negative statement means that every possible item in the sentence has to be negated. The best estimate is that single negation is used by only twelve per cent of languages.
Double negatives are in fact particularly useful in English with our sometimes inaudible negatives in contractions, like the /t/ in “can’t”. So they clarify as well as emphasise. Double negatives can only cause confusion by being deliberately misinterpreted – “Ahah! So you’re saying that you actually did something?” As the grammarian’s grammarian HW Fowler concedes, there is “little risk of ambiguity” with double negatives. But then there so rarely is even a hint of confusion or ambiguity with any of the “grammar” school’s hang-ups.
So – no confusion but a lot of stigma. My advice: in any middle-class standard-required context, in speaking as well as writing, take great care to maintain a single-negative system – “I didn’t do anything”; “he saw nobody”; “she wouldn’t do anything like that to anyone. Ever.” For an awful lot of standard speakers, two negatives still cause positive revulsion.
Negative plus plural
“None of your preferred networks are available” – one of the more depressing sentences of modern life, and one that Apple got wrong, according to the traditionalists, who insist that a negative word like “none” should take a singular verb: “none of your preferred networks is available”. If you want to appear ultra-correct, go for negative plus singular – “I couldn’t write the report because none of my preferred networks was available”.
Singular noun plus plural
This can get British English speakers in a bit of a tizz because, more than in American English, Brits often take some singular nouns to be effectively plural.
In every English-speaking country, singular words like “family” and “government” can sometimes go with plural verbs, and a few others, including “police”, always act as plurals. In British English, anything to do with a team takes a plural verb – “My family hate me and the police are looking for me, but at least United are two nil up.”
Split infinitives
Famously, we are supposed to not split our infinitives - ie, to not put any word between “to” and its verb. But we do. Some of us seem to always do it. Celebrity infinitive splitters include John Wycliffe, John Donne, Oliver Goldsmith, ST Coleridge, Matthew Arnold and George Eliot. Splitting the infinitive often sounds okay or even preferable to not splitting it. “To boldly go”. “To not do something.” “To never say something else”.
However, this suitability might be relatively new. That list of literary worthies looks impressive, but if the split infinitive comes so naturally to us all, shouldn’t it feature just about every famous writer who’s ever written? My guess is that it used to not be so popular until its appeal began to steadily grow in the early part of the nineteenth century – which would explain why, by 1864, the Victorian clergyman Henry Alford had single-handedly invented a decree against it. Why? No apparent or feasible reason. The Rev Alford just didn’t like split infinitives and banned them. I’m all for them and like to sometimes use them in the knowledge that they sound nice and will wind some people up – but I think in general, it’s safest not to use them in any halfway formal sort of writing.
Shall / will
I thought this one had dried and withered and fallen to the ground some time ago, but appaz not – the shall/will nicety seems to continue to be a staple of our “grammar” guides. Here’s the state of play: the real rule is that almost all of us almost always use “will” rather than “shall”, apart from a few specific cases, such as proposing movement (“shall we?”) or proposing, in particular, a method of transport (“shall I call a taxi?”). As a normal working auxiliary verb for the future tense, used either in writing or everyday chat, “shall” is usually noticeably posh and British, being confined to a pretty elite sub-dialect of highest-rung standard in the UK. Everyone else, everywhere else, almost all of the time, uses “will” or “going to”.
The old-school rule sticks to a version of the shall/will division which either is long gone or didn’t really exist in the first place. What they say is that in “correct” English it’s “shall” for “I” and “we” and “will” the rest of the time – “I shall, you will, he / she / it will, we shall, you will, they will”. Except for emphasis, when the order is reversed – “I will not be denied!” “You shall go to the ball.” “He shall be king!” “We will win!”
A few seconds’ thought is enough to show that the old-school rule just doesn’t work. Take “we will win” – that is actually a straightforward statement. Who could possibly say “we shall win” as the norm? “Trespassers Will Be Prosecuted” – is that not emphatic enough? Isn’t it much preferable to the supposedly stronger but somehow pretty mimsy “Trespassers Shall Be Prosecuted”? I think that, like “I”, “shall” is often taken to be the posh, refined alternative. Hence, I’m assuming, the official announcement on the London Underground, which seems unnecessarily grand – “The next lift shall be the lift on the right”. Or the tannoy announcement I heard on a train the day before writing this – “The next station stop shall be York.” The guy opposite me raised his eyebrows. “Shall?” he said. “There’s fancy.”
Another pair of auxiliaries that bothers sticklers is “can” and “may” – “can” refers strictly to possibility or ability, they say very strictly, and “may” denotes permission. Thus the teacher’s pet response to “Can I leave the room?” – “I don’t know. Can you? Perhaps you are physically unable to move? Are you somehow incapacitated? Perhaps the whole class would like to know just what prevents … etc etc”. Tiresome.
ly-less adverbs
In standard, some adjectives add a “ly” ending when they turn into adverbs – “I am a careful person so I will do it carefully”; “she is a beautiful woman who sings beautifully”.
In all non-middle-class dialects of English, as in all the other Germanic languages, the adjective and the adverb stay the same. “It’s a beautiful house. They’ve done it up real beautiful”. “You’re a slow driver. You’re driving too slow”.
Not all adverbs have a “ly” ending in standard, but those which do take one must keep it, or middle-class standard speakers will see glaring mistakes. So glaring that even the traditional grammarians have noticed them and denounced them.
Final prepositions
See Prepositions in The Real Rules
Here is an example of a preposition at the end of a sentence to start with. This is another example to think about. And here’s another with a preposition added on. So it is undeniably true – putting prepositions at the end of sentences is something we seem to be into. Indeed, there are many times when it seems either awfully difficult or simply impossible not to have a preposition dangling at the end of a sentence. “What did you do that for?” “He is someone I look up to.” “Watch out!” “Hurry up!” Even when it is just about feasible to hoik the preposition away from the end and in front of a “which”, the results are always stiltedly formal.
So why are we supposed to avoid final prepositions? Because in 1697 the poet John Dryden decided to ban them, on the grounds that Latin did not have prepositions at the end of its sentences. The problem with that reasoning is that English and Latin are two different languages – extremely different, in fact, because Latin was a heavily inflected language, with lots of endings and changes of word form to convey meaning, while English is an analytic language which uses little grammatical words such as prepositions to perform the work done by the different endings of Latin. So English’s hundred-odd prepositions have to do a lot of heavy lifting, and in the course of their daily toil, they can sometimes find themselves at the end of a sentence, especially if that sentence is using a phrasal verb.
(See Prepositions - Phrasal Verbs in The Real Rules).
“I’m fed up.”
“What with?”
“With your sleeping around.”
“Hold on. You’ve dreamt that up.”
“No, I’ve found out.”
But let’s try to heed the traditional rule and avoid final prepositions from now on.
“Out what do you think you’ve found?”
“All the people with whom you have been sleeping.”
“Oh, on you come! One time up when I slipped! One time up which I messed!”
No prepositions at the end of sentences? What? This is a daft rule invented on a mad whim by the entirely misguided poet laureate in the late seventeenth century. However, it continues to thrive because the traditional grammarians have managed to keep on imposing it – helped, I think, by the results sounding so unnatural, their lawyerly formality making them appear like the kind of structure that only the most sophisticated and expensively educated of us would use. “To which department shall I send this?” “To which page shall I turn?” Plus it often gives the chance for a similarly prim and proper “whom”: “the person to whom you are referring”; “the person for whom this package is intended”. My advice – pay no heed whatsoever to this no-preposition-at-the-end “rule”. Let’s throw it out.
More / most + -er / -est
You can’t have both, say the traditionalists, although many of us do – “she’s more richer than you”; “it’s more costlier these days”; “these are the most loveliest flowers”. One form or the other is the made-up rule and it’s one that is rigorously enforced – if you use both forms together in writing, as one traditional grammarian puts it, “your readers will think you a buffoon”.
No increasing of superlatives
This rule states that, for example, perfect is perfect – it’s logically not possible to go beyond that, so you’re not allowed to call something “very perfect”.
It sounds vaguely plausible, but it is actually a fairly new rule, and it would have taken everyone aback in earlier centuries – the phrase “a more perfect union”, after all, appears in the American constitution. Here is a “more perfect” turning up as recently as 1919, in the prose of TS Eliot, one of the greatest writers of the twentieth century and one of the most buttoned-up and etiquetteful, who wrote, “the more perfect the artist . . . the more perfectly will the mind digest …”
Less / fewer
The traditionalists maintain that we’re supposed to use “fewer” in some cases – they often can’t explain it because they don’t really understand what’s going on, but their rule would be that “fewer” has to be used with “countable” nouns (See Nouns in The Real Rules), basically any noun that is in the plural. Hence the campaign to have supermarket checkouts changed to “Five Items Or Fewer”. Or the poster advertising a radio station with the slogan “More music, less commercials” defaced with an added “fewer grammar”.
The thing is, that’s not how it works in English and not how it’s ever worked. Here, for impeccable instance, is the Queen’s Speech, way back in 1965, when you’d have thought that she, of all people, would have known better: “It is difficult to realise that it was less than fifty years ago that women in Britain were first given the vote.” Shouldn’t that be “fewer than”? Well, clearly, no, and not because the Queen and the rest of us are all getting a rule wrong – the rule’s wrong and we’re right, and of course the Queen uses “less than” here before the “fifty years” because she’s not counting those years, she’s thinking of them as a fifty-year block. And when people do use “fewer than” with countables, it can sound very odd – “I’ve been living here fewer than seven months”; “he drove for fewer than two miles”; “there are fewer than five items in my basket”.
He / they
“Every reader has X own opinion”. The literary convention used to be to avoid the plural “their”, which is what most of us use in speech, and to opt for the singular masculine form. Some traditional grammarians gamely claim that this should still be the rule – “does everyone have his suitcase?” The trouble is that this really does now strike many people – for example, me – as daftly sexist and jarringly noticeable: experiments have shown that both men and women do associate generic “he” with men. How about, for instance, the “he” pronouns in the UK's Human Rights Act of 1998? (“No one shall be deprived of his life . . . his liberty . . . his civil rights”.) But even in unloaded contexts the masculine-for-everyone “he” seems crazy.
If it were up to me, I’d decree the plural “they” in writing as well as in speech, but tragically it’s not, so there are various well-intentioned schemes to stay in the singular and remain acceptable. The earliest and still-safest option is to use both masculine and feminine singulars – “each person has his or her own view”. A more recent practice has been to alternate the gender – using “she” and “her” for one paragraph, “he” and “his” the next. This is probably the least unacceptable solution at the moment. I’d still recommend the plural, by making the subject noun plural – “readers have their own opinions”; “voters have their choice”.
Who / whom
Traditional grammarians decree that we should use “whom” when it refers to an object noun or pronoun – “She was the person whom I met”, “this is the boy whom I sold the ice-cream to”. “Whom” is used in actual speech by a small minority of speakers and doesn’t exist in any non-standard dialect. I think it now sounds ostentatiously correct, up-itself and archaic, so I’ve been pleased to see an increasing use of “who” instead of “whom” even in literary fiction from the most highbrow publishers. My advice is to avoid “whom” unless you feel compelled to use it.
Subject pronouns in weird places
See Pronouns in The Real Rules
Traditional grammars still demand that we use subject pronouns (“I”, “we”, “she”, “he”, “they”) after “to be” verbs and in comparisons with “than”. The Oxford Guide to English Usage, no less, decrees that the correct answer to “Who killed Cock Robin?” is “I”, and that we should use such constructions as “it is she who has done it” and “better than I”. Crazy.
That / which
This is one of the most difficult traditional rules to understand far less follow, baffling and defeating even the most ardent of “grammar” fans. Here’s the difference explained by the great figure of English “grammar”, HW Fowler: brace yourself for bewilderment and the sad sight of a sentence being strangled by correctitude – “The two kinds of relative clause, to one of which that and to the other of which which is appropriate, are the defining and non-defining.” What the nice man meant was, “that” is supposed to go with phrases that define and pinpoint, and “which” with phrases that add extra information – “the house that Jack built fell down”; “the car that she’s driving is a Land Rover”; “the house, which Jack actually built himself, has no front door”; “the car, which I’d bought for £300 the day before, suddenly burst into flames.”
Of course, this rule has no application in reality, which is why it is so hard. In real life, “which” and “who” are far more common in middle-class standard English, “that” and “what” being preferred in most working-class non-standard dialects.
Possessive + gerund
It is an old old-school regulation that the possessive pronoun should go before the gerund (the “-ing” form of a verb which has become a noun). See Gerunds in The Real Rules.
“I remember your having done that”; “his being there was important”; “their enjoying it was the most important thing”. Traditionalists like to think they’re maintaining an old and established convention, but this structure never seems to have been used much. For some, though, it’s a wonderful signifier of one-of-usness: it “suggests sophistication,” says Simon Heffer, because “it is not a usage that comes easily to the uneducated”. (Too brainy for them. Too refined. Too superior in every way. One thinks of Jacob Rees-Mogg reflecting that he’d have escaped from Grenfell Tower.)
Dangling modifiers
Why do linguistic terms have to be so repellent? Some seem to belong in the dustiest of legal textbooks – accusative, participle, clause. Others – genitive, copula, gerund – to the STD clinic. Fitting both categories, with a particular ooh-er-missus quality all of its own, there’s the dangling modifier, or – with another ooh-er – “dangler” for short.
Modifiers dangle when subjects get displaced or lost. “After eating the local delicacy, the museum was next on our list.” “Pausing only to wave goodbye, the car drove off into the sunset”. “Having invaded Iraq, I no longer trusted the government”. The basic rule is that the thing you’re talking about – the subject – has to be the next thing you say after the introductory phrase. As they stand, those sentences say that the museum ate the local delicacy, the car waved goodbye and I invaded Iraq. The solutions would be “After eating the local delicacy, we went”, “Pausing only to wave goodbye, they drove off” and “Having invaded Iraq, the government . . .”.
Well, that’s the supposed rule but I’m not sure how current it is. Certainly, danglers are very common in writing and they are actually a very normal way of using introductory phrases in speech, so that they turn up with enjoyable regularity on the most earnest news bulletins. “Having killed and tortured thousands of people, the Archbishop of Canterbury condemned the Saudi regime”; “After eating rancid mince out of the garbage, the president sent his dog to the vet”; “Often filthy and always dangerous at night, the mayor vowed to clean up the city’s car parks.”
Words which traditionalists think are wrong
These are controversial words because they have changed: the traditional grammarians cling to the old meaning or use as the one true one and denounce the more recent as wrong.
Having trawled through the currently best-selling “grammar” books, I have mustered surprisingly few words – there’s a lot of noise about them but I think there are only ten or eleven. And some of those seem to be already lost causes – does anyone still get upset about “enormity” meaning bigness rather than monstrous evil? My guess is that there are actually only half a dozen words which bother most sticklers. Of course, sticklers like to find howlers and barbarisms where’er they can, so there are others, depending on the stickler’s stickliness.
None more stickly than the UK’s leading traditional grammarian, Simon Heffer, who gets upset by our misusage of a lot of words. I haven’t included them in the main list, for reasons of sanity, but will mention a few here, for reasons of instruction and entertainment. Like all traditional grammarians, Simon Heffer has one main mission: to tell English how to behave itself and to make it stop … actually, just make it stop. When? Well, for him, in 1928, the year linguistically blessed by the completion of the first edition of the Oxford English Dictionary, which is Simon Heffer’s bible of semantic wisdom.
Not at all strangely, the intervening years have seen quite a lot of words shift their meaning for Simon Heffer to get upset about. “Free”, for example, meaning “no cost”, as in “buy two, get one free”, “a free coffee”, “a free offer” – this is a “vulgarism”; one should say “free of charge”. “Orphan” – there’s another word we all get wrong. In 1928, the OED defined “orphan” as someone who has lost one parent, and that’s good enough for Simon Heffer: what the rest of us call an orphan is in fact a “double orphan”. “Homosexual” – an adjective in 1928 and so it should still be: “a homosexual is as wrong as saying a handsome, an ugly or a tall.” “Adultery” has to involve a definitely married partner – if both are unmarried, they “are not committing adultery. They are fornicating.” . . . Hm.
Sometimes, it’s not the 1928 Oxford English Dictionary but Latin that Heffer relies on for his strange but definitive definitions. “The car collided with the tree” is wrong, he says, because “collided” refers to the coming together of two moving objects, because it comes from the Latin “collidere”, to strike or clash together. “Like so much of our language,” Heffer explains, “this is a question of logic based on the etymology; there is no perversity about it”. Well, of course not . . . Actually, it’s a nice argument, in the sense of the original Latin word “nescius”, ignorant. As “nice” shows, words can change their meaning.
And
It’s our third-most used word, but the traditional rule is that in written English we shouldn’t use an “and” to start a sentence. I suppose this must have started as a well-intentioned top tip, to avoid sounding naive and repetitive, the written equivalent of a young child’s breathless “and then . . . and then”. But “and”, and indeed “but”, obviously can start sentences, if used wisely. In the earliest days of writing, “And” used to be just about the only way to start a sentence – look at the Old Testament. The same bald “And . . . And . . . And” sentence structure also appears in the earliest prose of other ancient languages, like Akkadian and Vedic Sanskrit. It seems that complex sentences, particularly ones using subordinate clauses, took a long time to emerge and did so when societies became more complex.
Decimate
Here’s one from the Latin-ancestry school. Nowadays, “decimate” is mostly used as a slightly exotic alternative to “devastate” – “Pompeii was decimated by the eruption”; “his affair with the plumber decimated their marriage”. However, the traditionalists maintain that “decimate” should stay true to its Latin root, “decimare”, which referred to the Roman military practice of punishing regiments by executing one soldier in every ten. Therefore, say the traditionalists, to decimate is to destroy ten per cent of something. Thus rendering it useless, one would have thought, there being no need for any such verb now that no English-speaking military goes in for any such practice. To specify the degree of destruction, we’ve already got “halve” and “quarter” as verbs and a clutch of quantifying adverbs – “he ruined it almost completely / nearly all of it / most of it” – so most levels of destruction are catered for and there’s no evident need for a specifically ten per cent one. In fact, there already is a ten-per-cent verb – “tithe”, from the olden days when it usually meant a ten per cent tax. Anyway, all that plus maybe the “-ate” ending explains the quick migration of “decimate” over to the “devastate” end of the destruction scale.
Understandable and sensible, but the traditionalists think it’s a howler. To them, “decimate” has to refer to ten-per-cent destruction. Its “wrong” use signals, for some, a lack of Latin in the speaker’s school syllabus, and an IQ that’s not quite up to scratch.
Steven Pinker defends devastating “decimate”, pointing out that by the same appeal to its Latin roots December would be the tenth month of the year. Pinker writes as he preaches – in his brilliant and enormously important Enlightenment Now, here he is, using “decimate” to refer to huge rather than ten per cent destruction: “wilderness preserves are set up only after indigenous peoples have been decimated or forcibly removed from them”. Hats off to Steven Pinker, but my advice is to avoid “decimate” and go for “devastate” as the always-safe option.
Different from / than / to
For many people, this is one of the most fraught constructions in the language, because they all seem feasible. “This one is different X the other.” So which is it? Which is right? “From”, “than” or “to”?
And the answer is – true-to-bleeding-form – all three. “Different from”, “different than”, “different to” – all are possible, with no shift of meaning or even nuance. “Different from” seems to be the most popular choice everywhere. “Different to” is second in the UK, Australia and New Zealand, whereas the runner-up in the US is “different than”.
All three are fine, but the tyranny of the traditionalists’ right-or-wrong, good-versus-bad, one-correct-version attitude makes us think they can’t be.
Disinterested
For many traditionalists, this word is up there with the apostrophe as one of the most important items of our language, to be most stoutly defended against the forces of barbarism and illiteracy and the collapse of civilisation as we know it. The “proper” meaning of “disinterested”, according to them, is not “uninterested” but “impartial, neutral, objective, having no vested interest”. To many traditionalists, using “disinterested” to mean “uninterested” is a ghastly mistake – not just “ignorant bullshit” to Kingsley Amis but “depraved”.
If there is the slightest hint of a stickler alert, it is safest to say “uninterested”. If the stickler alert is sounding loud and clear, you can win great credit by using “disinterested” as clearly meaning “impartial” – “well, taking a disinterested, objective view . . .” = lots of Brownie points from a certain sort of person.
Due to / owing to
I am not sure how prevalent it is these days – hopefully, the battle has been lost. Yet Simon Heffer is still very stern about the subject in his preposterous guide, Simply English: “Due to should not open a sentence: and it can be used only when something (noun a) is due to something else (noun b): as in ‘mass starvation was due to crop failure’” . . . Right. Moving on: when most sticklers object to “due to”, it is because they think it should not mean “owing to / as a result of” but “being owed something”. I think traditionalists tend to get a bit confused by their own attempts at separating the two expressions: just as Simon Heffer explained “due to” by using “due to”, Kingsley Amis swerved past a conventional explanation of the alleged misuse by appealing to the “organ of grammatical fitness” – ie, whatever sounds right, as long as you have the ears of Kingsley Amis.
Enormity
I am fairly confident that the traditionalists have lost the battle on this one, so I include it out of nostalgia for the olden days when a certain sort of ultra-stickler would maintain that “enormity” is not the noun form of “enormous” but that it means something like “monstrous evil”, something that’s huge in the sense of being hugely heinous. The traditionalists have been trying to tell us that this is its proper meaning since the late nineteenth century. However, since the eighteenth century, the vast majority of people have taken it to mean “enormousness”. Very useful “enormity” has proved with that meaning, too – it makes obvious sense, and it’s easier to say than “enormousness”, which indeed sounds so daft it doesn’t really exist.
Fortuitous
Another word which has shifted in meaning, from “accidental” to an upmarket alternative to “fortunate”. The new use is noticeably popular with football commentators who say things like “a fortuitous intervention” to offer a bit of tonal variety when what they mean is “tackle”. Traditionalists insist that “fortuitous” still means “by chance”. My pragmatic advice is to use “fortunate”.
Fulsome
This has come to mean “really full”, as in “he pleaded guilty and made a fulsome apology”, but originally “fulsome” meant something quite different – “cloying, excessive, disgusting by excess”, according to the Oxford Guide to English Usage in 1993; “excessive, insincere and given with a view to currying favour”, according to Caroline Taggart’s definition, and that’s what she and the very few others who are in linguistic step think it should still mean. Traditionalists like to lament that they are the last heroic defenders, that the cause is about to be lost, that their peeves are always just about to be blown away – well, I think “fulsome” really is a lost cause, and that the only people who now think it should mean “excessive, insincere and given with a view to currying favour” write “grammar” books. Even British peers of the realm use the more recent, “very full” meaning – I can cite Baroness Boothroyd, who criticised a Conservative minister who had fiddled his expenses and then apologised, her complaint being that the apology “wasn’t very fulsome”.
Hopefully
This word has shifted in meaning as it has shifted function. It used to mean “doing something while full of hope” as in that adage, “to travel hopefully is better than to arrive”. But for the last fifty-odd years in the UK, going on a century in the US, people have been increasingly using “hopefully” as a sentence starter, meaning “let’s hope that”. “Hopefully, it won’t rain tomorrow”. “Hopefully, I’ll get off with a caution”. “Hopefully, the antidote will work”. “Hopefully, those aliens are friendly”. But to old-schoolers, this is a real howler. To Kingsley Amis, this sentence-starting “hopefully” was “never respectable”. To Simon Heffer, it is simply “wrong”, the mark of a “barbarous” writer.
Oddly, similar sentence-starters have been accepted – “fortunately”, “naturally”, “amazingly”, “funnily enough”, “regrettably”, “luckily”, “incredibly” and “wisely”. So why the kicking for poor old “hopefully”? One theory is that it’s because the sentence-starting “hopefully” was an American creation – sneered at in the UK as an alien import, and sometimes sneered at in the US, possibly because it was a shift popular with German-speaking immigrants, who had “hoffentlich” in their native language.
My advice – up to you. Personally, I take a cautious approach, on the grounds that the only kind of attention it will get will be negative, so I avoid sentence-starting “hopefully”. I used it deliberately in the “due to / owing to” entry and, after so many years of habitual self-censorship, I really had to force myself to do it.
Irregardless
One of the more recent additions to the pet-peeves list, this sets some jowls a-quiver because it is a sort of double negative, the basic “regardless” acquiring an extra “ir” at the start which, declaim the sticklers, is not only not necessary but actually, logically, makes the word mean the opposite, “not regardless”. As with the conventional double negatives, there is never any danger of confusion with “irregardless” – the extra “ir” has been added to emphasise, and obviously so.
Having said all that, I think “irregardless” mostly exists in the troubled minds of the old folk who write “grammar” books and the younger twerps who give us their “five top grammar howlers” on YouTube. Does anyone actually use “irregardless”? I suspect not.
Literally
“That is literally the worst thing that’s ever happened.” “And like, my head literally exploded.” “Literally” as an intensifier – it’s one of the pettest and peeviest pet peeves of our time. I’m guessing that it’s today’s number one, maybe replacing “hopefully” as the traditionalists’ most loathed word.
Like “hopefully”, “literally” has roused ire and outrage by shifting its function and its meaning. Originally, it meant “in a literal sense” or, especially with translations, “word for word”. These days it means something very close to the opposite of literal truth – “I am literally dying of thirst”; “I was literally comatose with boredom”.
The traditionalists have been appalled but it seems that younger people in particular use “literally” very knowingly. “And he was like literally insane with envy”; “I literally died laughing” – such statements are based on realising that the meaning is the reverse of the original and persisting with this use partly for comic or dramatic effect, partly as a gesture of solidarity (I get it and I’m assuming you get it too).