Just to illustrate what a morass "poor grammar" is, I have, wearing my ex-professional proofreader hat, gone through the article to highlight the areas where I could find fault with it for incorrect usage.
People Who Use Poor Grammar should be: People Who Use Grammar Poorly.
I have a "zero tolerance approach" should be: I have a zero-tolerance approach.
people who mix up their itses should be: people who mix up its and it's.
passed over for a job — even if should be: passed over for a job—even if.
I don't point this out to be pedantic or to level a tu quoque at the author, but to point out that the idea of an "English grammar" that you can apply universally to writing is a myth. At best you can create a style guide, or follow an existing one. Expecting employees to be able to follow a known style guide is a reasonable request. Expecting them to score perfectly on a grammar test against a style guide they've never seen, without (presumably) computer assistance, when scored by someone who believes they're capable of scoring a grammar test without recourse to existing style guides, is foolish, in exactly the same way that expecting a programmer to remember whether String.find takes (needle, haystack) or (haystack, needle) as its arguments is a terrible interview technique.
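The String.find analogy holds up across real languages: argument order genuinely varies by convention, so recalling it cold tests memory, not skill. A quick Python sketch (the haystack/needle strings here are arbitrary examples of mine):

```python
# Substring-search argument order differs across languages:
#   C:      strstr(haystack, needle)
#   PHP:    strpos($haystack, $needle)
#   Python: haystack.find(needle)  (a method on the haystack itself)
haystack = "correct horse battery staple"
needle = "battery"
print(haystack.find(needle))  # prints 14: index of first occurrence, or -1 if absent
```

In Python the question never arises because the haystack is the receiver, which is one argument for method-style APIs over free functions with two same-typed parameters.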
All the examples you cite are stylistic choices and have nothing to do with "grammar". (I also personally disagree with all your suggested edits, especially the first.) Using Hungarian notation in C may be poor style, but it conforms to the C grammar.
For clarity, when I talk about "grammar" here I'm talking about what Wikipedia calls "orthography", which is what I think the OP was talking about. You're right that #1 is a style correction and #3 is a typographic correction--on the other hand #2 is unambiguously orthographic[0], even if it's not as well known as it's/its, and if the writer feels justified in making up "itses" to mean "it's and its", presumably because people can infer what it means from context, that puts them in a very shaky position as a prescriptivist.
The interesting point that I'm trying to make isn't that the author is wrong or right, because I don't really care much about slips in orthography. It's that once you take a prescriptivist stance you're either making an appeal to popularity or you're entering an arse-kicking contest with a seven-legged monster with no arse.
> once you take a prescriptivist stance you're either making an appeal to popularity or you're entering an arse-kicking contest with a seven-legged monster with no arse.
That's an extreme and unjustified dichotomy. The article is written in an informal linguistic register, not in the voice of a 19th century naturalist writing for the Royal Society. (Is my use of the passive voice in the previous sentence also "incorrect" because style guides frown upon it?)
Still, the article's language is clear, precise, and suggests that the author takes pains to accurately communicate his thoughts. We haven't seen the author's grammar test. I would suspect that it tests the kind of linguistic economy I'm talking about, not whether the test-taker has memorized obscure passages from Strunk and White. As such, the test is probably effective at eliminating candidates who don't give a damn about correctness.
I have a "zero tolerance approach" should be: I have a "zero tolerance" approach.
Your third edit, which I also don't agree with, isn't quite right. It should be: people who mix up "its" and "it's".
fourth edit: I've always hated this jamming of words together with an em-dash: it appears that "job" and "even" have something to do with each other, but they don't.
Oh, wow, someone nitpicking this guy's grammar — never saw that coming. What an unhelpful, predictable comment.
FWIW, I don't believe the author was talking about style guides at all anyway. Rather, he seems to be discussing general and very basic principles of English.
Very basic in whose judgement? If it's the author's (and presumably yours), fine, but then it's not a business case, it's a shibboleth. If you're agreeing with the author's claim that there is a "silent majority" (which I always find a dubious argument) of people who have their confidence in your business impacted, then you're far better off investing in a process that catches these errors--many of which can be caused by typos, rather than ignorance--before they are seen, rather than relying on an up-front test that's never retaken.
Someone else on this thread made an analogy to push-ups. If, as a business owner, you feel that it's essential for your business that any employee can perform 20 push-ups at any time, is it better to test them before they begin employment, or to put them on a brief training course when they join and have regular check-ups on their ability?
I actually don't agree with the author, so I'm not going to defend him. However, I don't disagree with the author based on the fact that he made some completely understandable mistakes in his own post.
I think it's a fair point, but just to simplify the argument:
Poor grammar often is an artifact of a person being inexperienced with reading, writing, or both. Either that or a sign of an unwillingness to learn. Both of these are red flags when hiring.
On the other hand, if your "zero tolerance" policy includes things like my usage of an oxford comma, chances are I don't want to work for you.
It probably depends if he's one of so many ignorant "grammar nazis" that don't understand how language works. Once you wade through the thin veneer of rule vomit, most prescriptivists have little to no clue about the language they are enforcing -- only demonstrating an ability to remember lists of uninformed, anachronistic and often ignorant rule sets that are usually intended for application in a specific milieu. They're usually foolish enough to then go and demonstrate this nonsensical approach in public. Usually as a pedantic and insufferably pretentious, high-friction social interaction: correcting minor typos, misapplying and enforcing meaningless rules, "correcting" conversational and idiomatic prose into stilted formal styles (thereby losing nuance and ultimately meaning). They then claim that it's important they do this because language is used to communicate, but they then demonstrate an almost perfect ability to use language in a non-communicative fashion.
On the other hand, even descriptivists can get a little frustrated when "too" and "to" are mixed up; "their", "there" and "they're" are conflated; "its" and "it's" are misused, and so on. It does suggest somebody who is unfamiliar with handling the language in written form, but indeed the correlation between intelligence and understanding the complex English grammar in its fullness is undeniably low. It may simply be that a programmer with poor grammar has spent more time handling code (which also happens to be a milieu with very clear right and wrong feedback mechanisms for incorrect syntax and structure) than handling written language. Feynman, for example, wrote very little prose of his own.
tl;dr This employment test only filters for people who have spent a great deal of time handling language, and provides almost no information on their time handling code.
Dude, I so want to go there, apply, and write at the bottom of the page:
"No Person except a natural born Citizen, or a Citizen of the United States, at the time of the Adoption of this Constitution, shall be eligible to the Office of President; neither shall any person be eligible to that Office who shall not have attained to the Age of thirty five Years, and been fourteen Years a Resident within the United States." -- And that's the law of the land. Please try to parse that until your head explodes.
Edit: I think I would add to that "Also, in your opinion does this mean that once you have been in the country for 14 years you are no longer eligible to be President?"
Combining "not" with "and" or "or" leads to ambiguity - there are no operator precedence rules in spoken language. I remember an "eating and drinking forbidden in the lab" notice with "not (eat or drink) == not eat or not drink" scribbled underneath (in math notation).
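The scribbled correction is just De Morgan's law, which you can check exhaustively in a few lines of Python (variable names are my own):

```python
# De Morgan's law: not (eat or drink) == (not eat) and (not drink).
# The naive low-precedence reading "(not eat) or (not drink)" differs
# whenever exactly one of the two activities is happening.
for eat in (False, True):
    for drink in (False, True):
        assert (not (eat or drink)) == ((not eat) and (not drink))
print("the lab sign parses the same way in all four cases")
```

Which is exactly the joke: the sign's author meant the conjunction of two prohibitions, but English gives the reader no precedence table to confirm it.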
By Contemporary Standard American English standards comma placements are strewn throughout the sentence with very little regard for clarity (removing a few would make it clearer), and the logical relation between clauses leaves something to be desired.
Standards for comma usage weren't established until almost the end of the 19th century. Some people used breathing or personal stylistics to decide on comma use, but there certainly was no general expectation that it would follow grammatical construction or logic.
Compared to some other 18th century documents I've read, the US Constitution is downright spare with commas.
However, by the standards of English then in effect, it makes perfect sense. Part of the problem is that written English prior to the 1900s included a large element of (conscious or subconscious) verbalization of what was written.
Try reading the Constitution out loud, taking appropriate breaks at the punctuation. It becomes a lot easier to understand and read; most (but not all) of the ambiguity will disappear.
No Person except a natural born Citizen, or a Citizen of the United States, at the time of the Adoption of this Constitution, shall be eligible to the Office of President.
The grammar rule is that you can remove a clause within the commas - so this becomes:
No Person except a natural born Citizen at the time of the Adoption of this Constitution, shall be eligible to the Office of President.
So unless you were born sometime before 1776, you aren't going to be hitting the campaign trail.
No Person except a natural born Citizen, or a Citizen of the United States (at the time of the Adoption of this Constitution), shall be eligible to the Office of President.
Neither shall any person be eligible to that Office [of the President] who shall not have attained to:
- the Age of thirty five Years, and
- been fourteen Years a Resident within the United States.
There, fixed it for you. It's not that difficult to understand when you know how legal parsing works. In this case, the use of two subordinate clauses at the end of the first sentence indicates that the first subordinate clause is subordinate to the phrase before it, while the second subordinate clause is subordinate to the ("super") clause preceding the third clause. The super clause consists of the first and second clauses, because they are joined by a coordinating conjunction.
No Person except [ [a natural born Citizen, or a Citizen of the United States] (at the time of the Adoption of this Constitution) ] shall be eligible to the Office of President.
The canons of statutory interpretation forbid absurd, unreasonable or unjust interpretations. I believe that ultimately this is what lets the courts discard the odd parse trees.
I think that's because the term "Oxford comma" is something of a misnomer. The comma before "and" actually seems to be more preferred in the United States than in Britain.
"Opinions vary among writers and editors on the usage or avoidance of the serial comma. In American English, the serial comma is standard usage in non-journalistic writing that follows the Chicago Manual of Style. . . . It is used less often in British English,[5][6] where it is standard usage to leave it out, with some notable exceptions such as Fowler's Modern English Usage."
I prefer to use the Oxford comma when I'm writing a long list of items, and those items contain sets within themselves.
For instance, while listing computer brands that include Dell, HP, Acer, Toshiba, Alienware, Compaq and ASUS, I wouldn't use the Oxford comma. But if I'm listing out speaker brands like Bose, Klipsch, JBL, Bang and Olufsen, and Altec Lansing, I think the Oxford comma helps to clarify the intended meaning.
relatedly, there's a little-used rule that lets you use a semi-colon as your list separator if the individual elements contain commas: "My favorite law firms are Dewey, Cheatem & Howe; Robinall, Widowes, & Laffin; and Sue, Grabbitt, & Runne."
In one case I discourage it as well, because with it you can't write the funniest sentence ever composed: "I would like to thank my parents, Ayn Rand and God."
Otherwise, no strong opinion, although I usually use it myself.
True! And it's also very funny. (But a shade less so than the parents sentence.) Maybe our descriptivist/purist friends should start a campaign to bring back the dative.
Right, that was one thing I thought of. Is he testing just for actual grammar rules like verb conjugation and such, or is he also including superficial style rules like "avoid the passive" or typographic things like how many spaces after a period?
Even if this ensures a cultural fit I'd not be so binary about grammar.
Just a few months ago I helped a government department find an architect. The brief boiled down to someone who can code, but can also liaise with the department's directors and vendors, up to CxO level. Clearly, grammar and communication ability in general were key requirements.
The recruitment process involved submitting a resume and answering standard questions in an application form. There was a deadline beyond which applications wouldn't be considered.
Of all the resumes I reviewed one stood out. For two reasons.
First, his technical ability, if the resume was to be believed, was impressive. More than a government department could hope for.
The second reason was that his answers in the application form tanked. The very first answer simply stopped mid-sentence. There were a number of spelling mistakes in the remaining answers.
I really liked the resume, so I invited him in for an interview anyway. Turns out he is in fact as good as his resume suggested.
The reason for the sloppy application form was that he only spotted the job advertisement at 4pm the day of the application deadline, and had to hustle to get his application in before 5pm. He'd heard about the project and was desperate to be a part of it.
He's been one of the best hires I've made. Being fanatically dogmatic about anything in life closes doors you'd never have imagined were even there. I hope I never become that myopic.
I've hired last minute applicants, too, so I understand where you're coming from. We are very happy to make exceptions to the policy in situations like that.
We've hired dyslexic people who have worked out brilliantly. Candidates that don't speak English natively are another exception. We try hard to hire people that can speak multiple languages. (I have found that learning a second language usually makes people better at English grammar.)
We don't expect a perfect score on our grammar test, and that test is one of many deciding factors in our hiring process. Just as we test programming candidates for programming competence, we also test them for writing skills. If you can't FizzBuzz, we're not hiring you. If you can't complete a sentence, we're not hiring you. Beyond that, we use a lot of discretion.
Many job descriptions are actually a listing of the problems that happened with the last employee.
"Efficient in MySQL" = the current SQL queries are not performing well.
"great code writing style" = the existing code is unreadable
"Understanding of HTML" = they spent more time fixing divs and attributes than writing new code
"Strong database architecture and implementation skills" = we can't afford to hire a DBA as well
"Team Leader" = there are existing employees who we don't trust in a management role
"Ability to work independently" = that last person was always asking questions that could have been Googled
There's sometimes a huge mismatch between what the employer is asking, and his needs.
For example, at my current job we're asking for an expert in HTML5, JQuery, CSS2.1, CSS3, JavaScript, Ajax, Mobile Web Development, Mobile Web Performance, Cross-Browser, Cross-Platform Development; debugging tools (Firebug or equivalent), DOM, Internationalization, Localization, Apache.
The truth is, we have one of the ugliest websites, on an awful CMS, and our webpages are on ASP (not .NET, plain old ASP circa 1999) with VBScript hosted on IIS 5.
What such a rockstar web developer will be doing here beats me (the company does pay way above average wages locally, but such a developer could just work remotely for US companies).
We've already wasted somebody who was a decent web programmer (he's doing mailing lists for the Marketing department); I guess the selected applicant will end up doing ASP pages (maybe they'll let him migrate portions of the website very slowly).
"While sloppy writing does not invariably mean sloppy thinking, we've generally found the correlation to be strong — and we have no use for sloppy thinkers."
I agree with this. There's also a strong correlation between sloppy writing and poor reading comprehension. This is usually someone who never takes time to sit down and read edited or well-crafted prose.
Why is good reading comprehension important for programmers? You see it often: A fellow programmer complains they can't find an answer to a particular problem they're having, even when the solution is staring them right in the face in the first Google search result. Good reading comprehension helps you help yourself.
The 90s are unfortunately long gone, but the one good thing about this is that we could finally just NOT give ESR and his populism any more attention...
Could you expand a bit on this? I've read his Cathedral and Bazaar book, and quite enjoyed it. I didn't realize there was a whole context that I should be aware of.
He was a "big deal" in the 90s, at least within hacker circles: wrote a lot of early free software / hacker culture related stuff & was heavily involved in the rebranding of "free software" into a form that corporate suits could feel comfortable with ("open source") & selling that idea to a wider audience. See 'The Cathedral & the Bazaar' and other works.
He wasn't all talk: way back in the day, he worked on emacs & authored fetchmail (fair warning: reading the fetchmail source will make your ears bleed.)
I think that's perfectly reasonable. Although I'm not a native speaker I quickly get annoyed at people's inability to form proper sentences. But only in formal situations, such as the job application scenario in the original article.
I went to a Barclays bank in the UK the other day to try and open a business account. One of the main reasons I didn't go with them was that the guy who was going to be my 'personal banker' could not spell the words maintenance ('maintainance') and developer ('devloper'). This just makes you seem incredibly unprofessional and unworthy of future dealings.
Once again here on Hacker News we are talking about hiring procedures for technical companies. Many people find this topic interesting, because most of us have applied for a job at least once, and many of us have been in a position to recommend someone else for a job, or to hire someone for a job. From participants in earlier discussions I have learned about many useful references on the subject, which I have gathered here in a FAQ file. The review article by Frank L. Schmidt and John E. Hunter, "The Validity and Utility of Selection Models in Personnel Psychology: Practical and Theoretical Implications of 85 Years of Research Findings," Psychological Bulletin, Vol. 124, No. 2, 262-274
sums up, current to 1998, a meta-analysis of much of the HUGE peer-reviewed professional literature on industrial and organizational psychology devoted to business hiring procedures. There are many kinds of hiring criteria, such as in-person interviews, telephone interviews, resume reviews for job experience, checks for academic credentials, personality tests, and so on. There is much published research on how job applicants perform after they are hired in a wide variety of occupations.
EXECUTIVE SUMMARY: If you are hiring for any kind of job in the United States, prefer a work-sample test as your hiring procedure. If you are hiring in most other parts of the world, use a work-sample test in combination with a general mental ability test.
The overall summary of the industrial psychology research in reliable secondary sources is that two kinds of job screening procedures work reasonably well. One is a general mental ability (GMA) test (an IQ-like test, such as the Wonderlic personnel screening test). Another is a work-sample test, where the applicant does an actual task or group of tasks like what the applicant will do on the job if hired. (But the calculated validity of each of the two best kinds of procedures, standing alone, is only 0.54 for work-sample tests and 0.51 for general mental ability tests.) Each of these kinds of tests has about the same validity in screening applicants for jobs, with the general mental ability test better predicting success for applicants who will be trained into a new job. Neither is perfect (both miss some good performers on the job, and select some bad performers on the job), but both are better than any other single-factor hiring procedure that has been tested in rigorous research, across a wide variety of occupations. So if you are hiring for your company, it's a good idea to think about how to build a work-sample test into all of your hiring processes.
Because of a Supreme Court decision in the United States (the decision does not apply in other countries, which have different statutes about employment), it is legally risky to give job applicants general mental ability tests such as a straight-up IQ test (as was commonplace in my parents' generation) as a routine part of hiring procedures. The Griggs v. Duke Power, 401 U.S. 424 (1971) case
interpreted a federal statute about employment discrimination and held that a general intelligence test used in hiring that could have a "disparate impact" on applicants of some protected classes must "bear a demonstrable relationship to successful performance of the jobs for which it was used." In other words, a company that wants to use a test like the Wonderlic, or like the SAT, or like the current WAIS or Stanford-Binet IQ tests, in a hiring procedure had best conduct a specific validation study of the test related to performance on the job in question. Some companies do the validation study, and use IQ-like tests in hiring. Other companies use IQ-like tests in hiring and hope that no one sues (which is not what I would advise any company). Note that a brain-teaser-type test used in a hiring procedure could be challenged as illegal if it can be shown to have disparate impact on some job applicants. A company defending a brain-teaser test for hiring would have to defend it by showing it is supported by a validation study demonstrating that the test is related to successful performance on the job. Such validation studies can be quite expensive. (Companies outside the United States are regulated by different laws. One other big difference between the United States and other countries is the relative ease with which workers may be fired in the United States, allowing companies to correct hiring mistakes by terminating the employment of the workers they hired mistakenly. The more legal protections a worker has from being fired, the more reluctant companies will be about hiring in the first place.)
The social background to the legal environment in the United States is explained in many books about hiring procedures.
Previous discussion on HN pointed out that the Schmidt & Hunter (1998) article showed that multi-factor procedures work better than single-factor procedures, a summary echoed in the current professional literature, for example in "Reasons for being selective when choosing personnel selection procedures" (2010) by Cornelius J. König, Ute-Christine Klehe, Matthias Berchtold, and Martin Kleinmann:
"Choosing personnel selection procedures could be so simple: Grab your copy of Schmidt and Hunter (1998) and read their Table 1 (again). This should remind you to use a general mental ability (GMA) test in combination with an integrity test, a structured interview, a work sample test, and/or a conscientiousness measure."
But the 2010 article notes, looking at actual practice of companies around the world, "However, this idea does not seem to capture what is actually happening in organizations, as practitioners worldwide often use procedures with low predictive validity and regularly ignore procedures that are more valid (e.g., Di Milia, 2004; Lievens & De Paepe, 2004; Ryan, McFarland, Baron, & Page, 1999; Scholarios & Lockyer, 1999; Schuler, Hell, Trapmann, Schaar, & Boramir, 2007; Taylor, Keelty, & McDonnell, 2002). For example, the highly valid work sample tests are hardly used in the US, and the potentially rather useless procedure of graphology (Dean, 1992; Neter & Ben-Shakhar, 1989) is applied somewhere between occasionally and often in France (Ryan et al., 1999). In Germany, the use of GMA tests is reported to be low and to be decreasing (i.e., only 30% of the companies surveyed by Schuler et al., 2007, now use them)."
Integrity tests have limited validity standing alone, but appear to have significant incremental validity when added to a general mental ability test or work-sample test.
Bottom line: if someone is hiring for a company that produces technical documentation, a company like iFixit.com, and one feature of the product is grammatically correct writing, it's a reasonable subpart of a work-sample test to include testing for revising English prose. If someone is hiring for managing a jewelry store (a local example I know) or for building wood-frame houses, it's quite possible that a work-sample test would completely disregard the issue of correct spelling and grammar. I know a very successful owner of a jewelry store (I know him as a fellow soccer dad who once coached one of my children) who has quite dodgy spelling and grammar and punctuation, but who can communicate in written English for emailing people. I'm aware of multiple local carpenters and other people in construction businesses, including managing construction businesses, who have varying degrees of punctilious correctness in English writing, but all of them making their reputations and their livings by how they construct buildings, not by how they construct sentences. If writing is part of the work (even just for exchanging ideas with colleagues in memos or emails), sure, test it. If writing is not particularly part of the work, don't worry about it.
Great post. Thought this was interesting from the Schmidt & Hunter article:
"This meta-analysis found that the validity of GMA for predicting job performance was .58 for professional managerial jobs, .56 for high level complex technical jobs, .51 for medium complexity jobs, .40 for semi-skilled jobs, and .23 for completely unskilled jobs."
Apparently having a high IQ is a better predictor of being a good business guy than a good hacker.
Take any "meta-analysis" with a large grain of salt (or better yet, an industrial-size vat of salt).
Meta-analysis suggests that men under the age of 26 have a 60% chance of being sex offenders. This same type of meta-analysis is also the reason that all sex crime offenders (including drunk people pissing in a park and amorous couples getting it on in a dark parking lot) are lumped together for sex offender registration purposes, because the meta-analysis suggests that the recidivism rate for "sex offenders" (regardless of actual offense) is greater than 90% (without regard to the actual recidivist offense).
TLDR: Meta-analysis can be used to support any claim.
Would you care to elaborate on what, specifically, is flawed about meta-analysis? I'd definitely be interested in a source on that 60% number and how that source arrived at a number that seems, on its face, so incorrect.
"I'd definitely be interested in a source on that 60% number and how that source arrived at a number that seems, on its face, so incorrect."
I'd be curious to see if anyone has actually done a study on this, but to me this number actually seems quite low. I would guess that it would easily be 90+%.
META: Karl that's a tremendously massive comment. Wonder if it might be more useful as an online FAQ that you could link to? I know I lost interest about 3 sentences in, but it looks like some great material in there.
I agree on the length. Ironic how, in an article on the importance of grammar, and presumably communication, the parent, while most likely using perfect grammar, fails to convince [1] simply because of the amount of material and the way it is presented.
I've done consulting for busy people who don't want to spend time reading a long email. They want to know the bottom line; however, I also find it good practice to provide backup information in case they, or someone they forward the email to, want to know how I came to my conclusions.
I start the email with a summary and a list of actions they can take or questions that I have. Typically no longer than a few sentences. Below that is a demarcation point to all the backup, links and further details (should they want to forward or in all honesty to CYA as well for the conclusions).
Edit: [1] Because if people don't read they won't be convinced and they won't learn anything. One of the reasons I typically avoid books by academics: too much difficult verbosity.
> If writing is not particularly part of the work, don't worry about it.
I suspect the author is (perhaps unknowingly) using his grammar test as a proxy for general IQ. General IQ correlates _very_ well with performance across a broad spectrum of tasks. (The notion that there are different IQs for different areas of life, though sentimentally appealing, doesn't correspond to reality.) It's no surprise, then, that someone with a high IQ (as measured by this proxy) would do well in a position that requires solving other complex problems, even if these problems have nothing to do with writing per se.
I was wondering about this as well. Is this a correlation number, and should I translate it as "the best tests give no better than a coin flip of selecting a good candidate"?
I know the NFL gives the Wonderlic to potential draft choices. I guess they must have done some tests showing Wonderlic is useful to get around disparate impact.
Oddly, Wonderlic scores have never been proven to have any positive link with future on-field performance. In fact, one study[1] actually found a negative correlation at some positions.
But as with many silly personnel evaluation techniques that are in common use, general managers will never get punished for doing what everyone else is doing.
1) Will he hire people who can't tell active voice from passive? (Check out the Language Log archives for how many grammar and style nazis can't tell the difference.)
2) Does he require that you can tell who and whom apart and use them in their correct cases?
If the answer is "yes" to both of these then I would assume that the people who pass the test are English majors with minors in Linguistics ;-)
> If the answer is "yes" to both of these then I would assume that the people who pass the test are English majors with minors in Linguistics ;-)
I don't know. I am not a native speaker, and am not an English major, but the answer is "yes" to both of them.
Am I understanding you correctly that you are saying people can't tell active from passive in "I wrote a letter" vs. "A letter was written by me"?
"Who" and "whom" is a bit tricky, but I follow the rule about "who -> he"(who broke the vase? he did), "whom -> him"(whom do you trust? him). There might be edge cases I am unaware of.
"Whom" is also almost impossibly archaic and no longer considered part of modern English (having almost completely fallen out of usage around the start of the 20th century). It survives almost entirely in discussions of when to use it and almost never appears in modern writing except as a demonstration of what linguists call a "prestige form".
"To whom did you give it?" vs. "Who did you give it to?": the latter is the more natural phrasing in modern English.
Huh. Really? I use whom all the time, and most of my friends do as well. Granted, a) we're all snobs, and b) we don't use it in every case we should, but we still use it.
"Whom" makes you sound a bit dated. Etymologists will get the joke. ;-)
(I suppose I should explain it. "Whom" comes from a dative case, and is not descended from a word that would be used for direct objects, but only for a subset of indirect objects and prepositional objects. However, if elative uses relate to elated things, then surely dative uses in English relate to dated things, right?)
So ask yourself the question: why not "thee", "thou" and "thy" (and the relevant possessive forms)? I'm not saying "whom" is wrong, only that it is archaic. Reviving just that word seems rather capricious and arbitrary, no?
I would say "whom" today is reserved for formulaic usage.
In 1921, Edward Sapir ("Language: An Introduction to the Study of Speech") noted the death of "whom." The example he gave was "Whom did you see yesterday?" vs. "Who did you see yesterday?"
Granted there are some cool things you can do with who/whom, like:
'Scots, wha hae wi Wallace bled,
Scots, wham Bruce has aften led,
Welcome tae yer gory bed,
Or tae victorie.'
Of course that's 18th-century Scots, and things have changed a bit....
First, "whom" today is more or less reserved for formulaic usage. You write "To whom it may concern," but you don't say "Whom did you meet yesterday?" If you start rigidly using whom as the objective form (a use that Sapir noted was dying in 1921), you sound stilted and put people off. It's the sort of grammar nazi thing that alienates customers.
Similarly passive voice, split infinitives, etc. all sound like great rules until you realize that there are plenty of cases where these so-called rules are actually good to break. For example, perhaps his productivity would more than double if he would stop worrying about whether his employees split their infinitives.
I don't know. English is a second language here in India, and the grammar covered in schools is pretty basic, yet I would wager most people would still recognize the passive voice.
The reason that error is so common among American English speakers, though, is that we are mistakenly taught that voice applies to whole sentences, so passive voice in a subordinate clause gets ignored. So you have the issue that
"Are you responsible" is active voice but "if papers were left on your desk" is passive.
But this also gets to the problem with the blanket avoidance of passive voice that too many grammarians push: there are times when the use of passive voice clarifies things.
The question, "Are you responsible if papers are left on your desk?" illustrates this very well. The questioner assumes it doesn't matter who left the papers on the desk, and there is no way to better word the sentence.
So in this discussion I have intentionally split infinitives (to emphasize that the goal is to do better at wording sentences), and illustrated why the passive voice is great, and in so doing have probably permanently disqualified myself from a job where grammar nazis rule.
In my view the ability to structure communications is important. I don't think that's the same as being a grammar nazi. For example if you have a zero tolerance policy for it's vs its, or for commas making things confusing, the framers of the US Constitution would fail the test. Not only is 'it's' used as a possessive, but try to parse this:
"No Person except a natural born Citizen, or a Citizen of the United States, at the time of the Adoption of this Constitution, shall be eligible to the Office of President; neither shall any person be eligible to that Office who shall not have attained to the Age of thirty five Years, and been fourteen Years a Resident within the United States."
Yet I doubt any of us would question the authors' intelligence, creativity, and professionalism.
The simple fact is that grammar mistakes and the grammar of non-standard dialects are one thing, but an inability to structure an email or other communication is a much bigger deal. You can't fault the guy who learned English as a second language and, because his native language has no gendered pronouns, gets confused all the time, and the same goes for non-standard English dialects like AAVE.
I don't know, I don't think your example sentence is particularly hard to parse, and I'm not a native speaker.
But I have to say, even as a non-native speaker, I just can't understand what's hard about "its" and "it's" or "their", "there" and "they're", and it makes me cringe every time I read such mistakes (as does confusing "loose" with "lose"). That said, I don't expect perfection, but if a text is littered with such mistakes, made consistently, I won't read it because, for me, it's arduous to correct all those mistakes in my mind while reading.
I still cringe at all the "would of"s and incorrect "your"s and "their"s on the Net. As an English speaker from outside the US, these aren't mistakes I commonly see in my country, and they grate. I'm like you - I find it arduous and irritating to read a passage littered with them, because each one pops up and distracts me.
That said, if I were hiring programmers, I wouldn't go by grammar. Business concerns and core capabilities come first. If the person makes mistakes but does the job best, no biggie. It all comes down to the situation in the labour market - does demand for programmers exceed supply, or vice versa? If I were flooded with good potential applicants then I'd weight grammar more heavily as a differentiating factor - but since good programmers are hard to come by currently, it's just pointless to do so.
Oh, the US Constitution is wonderful to throw at grammar nazis.
Article 1, section 10, clause 2:
"No State shall, without the Consent of the Congress, lay any Imposts or Duties on Imports or Exports, except what may be absolutely necessary for executing it's inspection Laws: and the net Produce of all Duties and Imposts, laid by any State on Imports or Exports, shall be for the Use of the Treasury of the United States; and all such Laws shall be subject to the Revision and Controul of the Congress."
> For example if you have a zero tolerance policy for it's vs its, or for commas making things confusing, the framers of the US Constitution would fail the test.
That's a question of knowing something about history. I wouldn't hire anyone with perfect grammar and style who was too dim to know that both are subject to change over time.
It really depends. Say you've looked at 100 candidates and one guy just knocks it out of the park: he's an expert at everything you need, he wrote the framework that your company relies on, and he wrote the book that your team uses as its reference bible. Upon contacting him, one of his sentences is "Im not available next week." It would seem foolish not to hire him -- at least IMO.
Poor grammar (i wuld leik to apply 4 jorb) is an acceptable thing to use when rejecting a candidate.
With that said, I have no interest in a writing career and I'm getting tired of reading about all these new and creative ways to take interviews as far from the subject matter as possible.
And finally, anyone who considers their experience the baseline to which everyone should aspire isn't someone I would be interested in working for. You're a professional writer. Good for you. I'm not. How about we take your ego out of this process and actually talk about something related to the position?
As long as it's a reasonable test, you can argue its merits.
But on a site of mine, I've gotten so many e-mails complaining that I say "the data is" instead of "the data are", or arguing that either "what year were you born in" or "in what year were you born" or "what year were you born" are variously wrong, and only one is right (but they all seem to disagree on which one).
Beyond a certain point necessary for clear communication, one person's proper grammar is another's irritating pedantry.
The test covers a variety of topics, and you don't need to get anywhere close to a perfect score. And it really covers the basics: misspellings, usage of to vs two, etc. This is about basic communication, not pedantry.
Most people don't know where the dividing lines are between grammar, spelling, writing style, and typographic style. Cf. several posts on this very thread.
Applicants who don't think writing is important are
likely to think lots of other (important) things also
aren't important,
"Citation needed." Or to put it another way, what is the difference between this statement and:
Applicants who don't think wearing a tie and polished
brogues to an interview is important are likely to think
lots of other (important) things also aren't important
The difference between those two cases is well explained by the internal logic of the article (which I find spot on):
> In blog posts, on Facebook statuses, in e-mails, and on company websites, your words are all you have. They are a projection of you in your physical absence.
I agree with the statement you quoted, but I don't agree with the statement I quoted. I have no difficulty agreeing with him on one point while taking him to task on another.
This practically excludes everyone who doesn't speak English as their native language. Depending on what business he's in, this can be a serious handicap for recruitment.
No, he says specifically that this doesn't apply to ESL speakers or dyslexics.
As a writer he objects to hiring people who don't care enough about writing to use grammar correctly, and he believes that this also strongly correlates with being a good programmer.
> Everyone who applies for a position at either of my companies, iFixit or Dozuki, takes a mandatory grammar test.
Whoa. Who do you think you are - Google? What makes you think I am going to sit through your grammar test? If I agree to take your grammar test, either you are one of the most desirable places to work for (never heard of you), or the economy is about to collapse and this is the only job I can find, or I am so incompetent and/or desperate that I will take anything that comes my way (beggars, choosers, etc.).
> Of course, we write for a living.
Oh, you should have mentioned that earlier. I wouldn't have gone into this internal monologue.
> But grammar is relevant for all companies.
Maybe it is. But not as relevant as you make it out to be.
> In blog posts, on Facebook statuses, in e-mails, and on company websites, your words are all you have.
Apart from my words, I have my intent, thoughts, opinions, facts. Words are a medium. You are giving them undue importance. If I am reading an article about face recognition using opencv, I am interested in code snippets and concepts. My mind auto-corrects "there, their, they're" and "its, it's". If I am reading about infant mortality rates in India, I am interested in figures, reasons, solutions. That is not to say grammar or writing style doesn't matter. I am saying it's not as important as you make it out to be, and good writing doesn't automatically come with proper grammar.
> If it takes someone more than 20 years to notice how to properly use "it's," then that's not a learning curve I'm comfortable with.
You are assuming that someone good at something assigns equal weight to, and is equally interested in, grammar as in whatever he is good at. I know good programmers who write weird English. Anecdote, data, etc. Neither of us has data, and anecdotes don't count for much.
> So, even in this hyper-competitive market, I will pass on a great programmer who cannot write.
Don't worry about it. To pass on a great programmer, you will have to get them interested in you first. It's a win-win situation. They aren't going to flock to your offices to take your grammar test, and you won't have to pass on great programmers due to bad grammar.
> Grammar signifies more than just a person's ability to remember high school English. I've found that people who make fewer mistakes on a grammar test also make fewer mistakes when they are doing something completely unrelated to writing — like stocking shelves or labeling parts.
Citations, please. Also, unless you are stocking shelves, how does it matter? Have you never come across a programmer whose desk is always messy? (I never came across one whose desk is clean.)
> In the same vein, programmers who pay attention to how they construct written language also tend to pay a lot more attention to how they code.
Citation please. And how do you know it's not the other way round?
> And I guarantee that even if other companies aren't issuing grammar tests, they pay attention to sloppy mistakes on résumés. After all, sloppy is as sloppy does.
I am all for proofreading resumes and cover letters, but "sloppy is as sloppy does" assumes someone who is sloppy at something is sloppy at everything. That's as far from the truth as it can be.
> Grammar is my litmus test.
You must be fun to work with. A CEO whose litmus test for hiring a programmer isn't programming finesse or cultural fit or drive..., but how good his grammar is.
My issue is if you're unable to delineate between "it's" and "its" and it's the language you've been speaking natively for the past 25 years, I tend to doubt your ability to know what form of "const" to use in C++. At the very least, lack of proofreading suggests to me you might be rash with your code as well.
Many disagree with me on this point of view, but routinely I've found that those that express themselves well in their native language write better code. That may be a self-reinforcing feedback loop because a lot of bad projects have a lot of terrible or non-existent documentation. And on a personal level I don't want to work with someone that communicates like a tween (NB: I'm not saying that you do, but I've run across it many times).
I also take a lot of issue with the "you know what I mean so you're being a pedant" retort. Not only is it wholly counter-productive, but it's not even accurate when you're talking about globally connected people. Quite frequently a non-native speaker will stumble upon what is written and be thoroughly confused.
> My issue is if you're unable to delineate between "it's" and "its" and it's the language you've been speaking natively for the past 25 years, I tend to doubt your ability to know what form of "const" to use in C++.
Eh, how about asking me about const-correctness rather than inferring it from my use of "it's" and "its"? You know, you can actually talk about const-correctness instead of extrapolating from a grammar test.
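For what it's worth, the "forms of const" at issue only take a few lines to show. This is a minimal sketch of my own (the `Widget` type and both functions are made up for illustration), not anything from the article:

```cpp
#include <cstddef>
#include <string>

// const parameter: the function promises not to modify its argument.
std::size_t length(const std::string& s) { return s.size(); }

struct Widget {
    std::string name;

    // const member function: callable on a const Widget, and forbidden
    // from modifying the object's state.
    const std::string& get_name() const { return name; }

    // non-const overload: only callable on a mutable Widget, and allowed
    // to hand out a mutable reference.
    std::string& get_name() { return name; }
};
```

A test that cares about this could just hand the candidate a `const Widget` and ask which overload gets called, which is a rather more direct probe than its/it's.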
> At the very least, lack of proofreading suggests to me you might be rash with your code as well.
It's hard to be rash with code if you aren't copy-pasting. It isn't the same as posting comments on HN.
> And on a personal level I don't want to work with someone that communicates like a tween
As I already said elsewhere, you are assuming things to be binary when they are not. The spectrum between "liek dis if u cry everytim" crowd and "People with bad grammar should be hanged" people is quite wide.
> Quite frequently a non-native speaker will stumble upon what is written and be thoroughly confused.
Not an excuse for bad grammar, but a non-native speaker is fine with both "didn't go" and "didn't went". I don't see how he is going to be confused. Non-native speakers are confused when they encounter phrases, idioms and slang they don't know. Fun exercise - ask an Indian if he/she would like to go out with you sometime. He/she would most likely say yes/no without understanding what you meant. Do you have any examples where simple grammar mistakes confuse a non-native speaker? I am a non-native speaker, and am curious to know.
While I could exhaustively ask about every aspect of every language you might have to work in, it's far simpler to use a proxy. The proxy in this case is your native language and seems to work rather well. And since your job certainly isn't going to be writing code for 8 hours straight, the ability to communicate well is a good skill to have.
For what it's worth, I use other aspects to proxy fit and ability as well. If you can't bother to clean up before an interview, I have a pretty reasonable idea of how you conduct yourself.
Re: the non-native speaker part. I've had to help a lot of people out over the years on various support channels because someone wrote "then" instead of "than" or "loose" instead of "lose." Since many people seem to be able to read & write in foreign languages but not necessarily speak them, homophones are easily confused. Personally I know enough French & German to follow along with most technical materials, but easily get derailed by misspellings. Is it an idiom I'm unaware of? Is it a word I don't know? Is it some sort of slang?
> The proxy in this case is your native language and seems to work rather well.
No, it does not. The very notion that someone who always uses "its" and "it's" correctly understands const-correctness is ridiculous.
Command over native language doesn't signify programming prowess. You are imagining things or extrapolating your personal anecdotes to absolutes. Please provide citations if it's a proven fact.
> For what it's worth, I use other aspects to proxy fit and ability as well. If you can't bother to clean up before an interview, I have a pretty reasonable idea of how you conduct yourself.
What on earth does cleaning up mean? I shower in the morning. I am not going to take a special shower for you. Whether or not I shave depends on my mood. I don't see how an interview with you should affect my facial hair. I am not coming in looking like a hobo, and anything beyond that isn't your call at all.
Why the hell do people count on a metric which doesn't indicate a person's qualities relevant to the job? Whatever the fuck happened to phone screening, checking up on open source projects, on-site problem solving, having a conversation, checking references? Where and why the fuck did wearing a suit or taking a grammar test come into the picture?
I think someone else pointed this out: it's a cultural affinity test. The shit about grammar indicating code quality is just a rationalization for wanting someone who is culturally similar to you, someone who has the same values as you. That is, someone who holds arbitrary things like grammar, facial hair, etc. in the same regard as you do.
Some of my coworkers are from India and China. I could make up similar things about how chewing with your mouth open and not wearing deodorant indicates a lack of awareness of manners and means they are unaware of memory leaks and race conditions in their code.
It's on me to come to terms with people chewing with their mouth open, not fire them because of subconscious cultural supremacy issues.
No, it's as I stated initially. If you're unable to keep track of the rules of your native language, something that should be second nature, I have severe doubts about your ability to keep track of the rules in Scala, Ruby, or whatever else. If it's simply that you don't proofread or don't care to, I have no reason to believe you'll do so when dealing with code. That problem is exacerbated in dynamic languages where typos won't be caught until runtime. Both writing natural language and code are forms of expressing ideas, problems, & solutions cogently -- you just have different grammar forms for the various languages, each with their own rules. The two are even merging with things like Cucumber.
No, I don't have exhaustive studies on this. That neither proves nor disproves anything. I never stated any of this beyond my own opinion, which has been formed and reinforced by 15 years of working in open source, running & working at startups, and working at big companies.
That aside, code is typically only part of your job. Documentation, blog posts, interacting with customers, partners, team members, etc. are all part of engineering. You don't have to like it or think it's just, but people do and will form opinions about you on this stuff. Many times it won't be other engineers, which you may be fine with, but engineering alone often doesn't make a successful business. You can disagree with it or be flippant about it, but it really doesn't change reality.
> If it's simply that you don't proofread or don't care to, I have no reason to believe you'll do so when dealing with code.
Really. So all past experience on a person's resume doesn't count toward that? Every other aspect of your interview with that person cannot possibly lend anything to increase your belief that they proofread their code? That is quite irrational.
It's also irrational to assume that because someone doesn't keep track of grammar rules, that they are unable to do so. That's a pretty big mistake of an assumption.
It should be hanged, actually, when referring to capital punishment, except when drawing and quartering are involved. It's a quirk of the language; hung is correct in most circumstances.
You seem to be using the word funny in a sense with which I am not at all familiar. Again, consider the non-native user of English, and the likelihood of encountering him or her on a site like this. "Ironic" grammar policing without an irony indicator isn't at all funny.
As a non-native speaker of English, I have been wondering how much signal it is appropriate for me to infer from the kind of sloppy grammar that is typical only of native speakers of English. [1]
I learned English at school as a second language, and we would always start with the written form, and then learn the pronunciation. So my "hash table" is primarily organized around the written form, and it would be impossible for me to mix up "two" and "too", "they're", "their" and "there", or "its" and "it's" [2]. I hadn't even realized that "too" and "two" are pronounced the same before a native speaker pointed it out to me, as my hash table doesn't support that kind of search.
Also my native language (Finnish) uses (nearly) phonetic spelling, so phonetic misspellings are mainly restricted to people with no high school education, who didn't do that well in primary school either.
I do, like the writer of the linked article, get that feeling of sloppiness when I see those spelling or grammar mistakes in English text, but I don't really know how much signal it is appropriate to infer from them.
I probably should not apply standards as harsh as those I use with Finnish, since those misspellings seem relatively common in English on the web.
[1] Well, typical of people who learned spoken English before written English, but in the modern world this pretty much coincides with native speakers.
[2] Well, "it's" and "its" is maybe a borderline case; perhaps not totally impossible to mix those two, just very unlikely.
My gut tells me (there's a scientific statement, if ever there was one!) that it correlates mainly to people who haven't read much in print. Print publications tend to be edited better than online ones. Unfortunately, bad spelling is reinforced by spending a lot of time online and being exposed to misspellings that are not corrected.
As a child I didn't have access to TV, so I read everything in sight and I read constantly. The end result is I have a particularly sensitive eye for spelling and grammar mistakes. I find the misuse of, e.g., "loose" instead of "lose," to be tremendously irritating. The people I know who read a lot simply don't make trivial mistakes like that unless they're in a hurry and mistype.
Is it sloppy? I would say it's sloppy if writing is a large part of your job. Otherwise, it's mainly an indicator of someone who doesn't read print very much.
It is also getting worse at an increasing rate. I remember when misspelled words and bad grammar in reputable magazines and journals were rare; now it's almost expected that anything I read will have a few.
Absolutely agree. I would never be able to take anybody seriously (in terms of hiring them) who can't spell or get basic grammar right in my native tongue, but this tends to be quite common in written English.
How much can one infer from the fact that a native speaker mixes up "they're" and "their"?
On the other hand, I would like to know since Finnish as a language fascinates me:
How common are noun-case errors among people who have a high school education? I mean something like using the elative case when the ablative might be more appropriate?
> How common are noun-case errors among people who have a high school education? I mean something like using the elative case when the ablative might be more appropriate?
Nonexistent. I don't think even preschool children ever mix those. I don't have kids, so I can't say how it is with the kids who are just learning to speak.
Also in English, you don't see people mixing "into" and "onto" even remotely as often as the mistakes the linked article talked about.
> My issue is if you're unable to delineate between "it's" and "its" and it's the language you've been speaking natively for the past 25 years,
I know the difference, and when I make a mistake I've found that every time it's been a mistake of my "finger muscle memory" when typing it rather than using the wrong word. In other words, I used the correct word, but I typed it incorrectly.
That is impossible to detect on any test where the answer is typed. Perhaps a followup to these sorts of tests is: "Here are all the places you used `its` and `it's`. Do you think they are all correct, and if not, which would you change?"
>"My issue is if you're unable to delineate between "it's" and "its" and it's the language you've been speaking natively for the past 25 years, I tend to doubt your ability to know what form of "const" to use in C++. At the very least, lack of proofreading suggests to me you might be rash with your code as well."
The difference is that const makes sense.
"Its" is the plural of "it", although we seem to use "they". "It's" is the possessive of "it", although we seem to use it as "it is". "It is" is proper English, although we tend to write it in slang as "it's". My use of punctuation respects the content of the quotations as the objects they are, although we seem to imply that "it" is spelled "it.".
I would say mixing up its and it's is a sign of a good programmer!
Apostrophe-s means a possessive most of the time, so logically "belonging to it" should be "it's". Programmers like logic; if the English language happens to get this case wrong, that's its problem!
I like to amuse myself thinking of people trying to use "her's," "his'," "their's," "our's," or "your's." The possessive form is pretty consistent with pronouns.
The reasons for those are interesting - all to do with losing cases when Middle English simplified, and having to fit a few borrowed Norse words into Old English's Germanic grammar.
Here's a thought: you're assuming I care equally about grammar and programming.
If I use the wrong form of "it's" in a sentence, someone might wince somewhere (possibly even rage a bit). No one is going to die.
If I use the wrong form of const, it's possible that all hell will rain down upon the Earth and millions will die in horrible gut-wrenching pain, clasping their loved ones to their chest and bemoaning the gods (the desirability of such an outcome is a wholly different matter).
I am a software developer for a medical device manufacturer. Poor spelling and grammar can change the meaning of a requirement, leading to a software error. Down the road someone can certainly die.
More likely, however, is that grammar and spelling errors in Requirements, Design, Procedure or other documents, or in code commit comments, or code comments themselves implies to an auditor for a Regulatory Body that our code itself is sloppy. That leads the auditor to dig deeper. Dig deep enough and you will find enough to hang someone.
For just this reason, we must pay attention to proper documentation, down to appropriate word choice. No, we don't test for usage of written English during interviewing, but it's strongly enforced during day-to-day work.
If you don't care equally about grammar and programming, then you aren't wanted. You can no doubt latch onto a dev shop that doesn't care about that.
I don't see the problem and I admire the attitude. Use of the written word is important and, aside from the exceptions mentioned in the article, I think significantly less of people who don't write or speak well. It's important to me to work with good communicators and you can't be a good communicator if you can't speak and write well.
> Many disagree with me on this point of view, but routinely I've found that those that express themselves well in their native language write better code.
Many people with perfect grammar communicate terribly, and many people with much less than perfect grammar communicate very effectively. If effective communication is important for a job position, perhaps looking directly for that skill would be more productive and fair?
Effective communication is important for every job position I've ever encountered. Whether it was when I was a porter at Dunkin' Donuts, a research assistant, a cog in the wheel at Cisco, or running my own company. At the very least, you need to be able to communicate effectively with the rest of your team and usually across departments.
Personally, I feel that to focus on trivialities such as the difference between "it's" and "its" to be morally indefensible. It's like being unwilling to hire people who are blind or deaf, or who need a wheel chair. (Uh oh! Is "wheel chair" one word or two? Or maybe it's hyphenated–oh no! I'm going to get fired!)
There are plenty of people who are quite smart and motivated, but are borderline dyslexic. Or maybe even not so borderline. For instance, I fully know the difference between "it's" and "its", and between "there", "their", and "they're". But I can stare at a paragraph for an hour and not be able to see such subtleties. Or sometimes even notsosubtle-ies. It's quite frustrating. And now I'm allegedly a poor employee to boot. Fan-fscking-tastic.
On the other hand, I am rather galled whenever I drive by the restaurant named "Your's".
Edit: I'd be willing to bet good money that whoever voted me down is not dyslexic. And despite making such mistakes frequently, I've often been commended for my clear writing. I received the highest grade given in several years in Creative Writing in college, got above 700 on my verbal SAT, and graduated from MIT. And yet some people want to judge me on whether I might type "it's" for "its" sometimes, and then utterly discount my opinion on that issue. Oy!
I didn't downvote you, but my guess is you were downvoted for making a rather tenuous and almost insulting relationship. There's a big difference between someone with an actual disability and those that just don't care about what they write. And I don't think anyone is even advocating for a zero tolerance policy. People make mistakes all the time. But calling it a "typo" or a triviality or otherwise getting defensive really doesn't help anything. It does however suggest how you might react when a "triviality" comes up in a codebase.
I've never been diagnosed with a disability, and now I am supposed to jump through hoops to try to prove to people that I should be given some sort of special exemption? As if I need more humiliation in my life than to notice after the fact that what I've written is riddled with unintended little mistakes?
As to people caring about what they write: The litmus test should be whether they can effectively communicate, not whether they have the same arbitrary bees in their bonnets that someone else has. I might not want to hire anyone who can't give me their opinion on whether or not Frege would have discovered Gödel's Incompleteness Theorem decades before Gödel, if Frege hadn't been so discouraged by Russell's Paradox as to give up his entire endeavor to derive math from logic. After all, someone who hasn't put in the effort to study even the basics of modern philosophy can't be trusted to really think deeply on any issue, can they?
This, no doubt, is true, but it would also be a morally indefensible hiring practice.
As for drawing tenuous and almost insulting comparisons: I find your claim that there is any correlation between having issues with "its" vs "it's" and the like, and the care one puts into one's code, to be a tenuous and insulting claim. You're insulting my code-writing ability, and you've provided no evidence for your claim. That's pretty tenuous. You've also conflated caring with performance. Although it may or may not be the case that caring is a general personality trait that spans all one's skills, performance can differ considerably between skills.
As to whether such grammatical distinctions are trivial, I assure you that they are. Having some linguistics background, I feel qualified to assert this. (MIT degree in cognitive science.) If we were to all agree today to just use "its" all the time, rather than using both "it's" and "its", the world would be no worse off.
Whenever you confuse the two, you cause me to have to reparse the sentence. "it's" becomes "it is" and then I have to determine that that doesn't make any sense and go back to substitute it. Vice versa. I also have a linguistics background. So I guess we'll have to agree to disagree.
> Whenever you confuse the two, you cause me to have to reparse the sentence. "it's" becomes "it is" and then I have to determine that that doesn't make any sense and go back to substitute it.
I'm sorry. When they come up with a cure for dyslexia, I'll be sure to get it. In the meantime, I suspect that most people are just dyslexic enough (dyslexia is a spectrum) that these little things cause them a great deal of trouble. It's not that they don't care; it's that if they did care, it would drive them crazy. Me, I do care about details. That's no doubt, in part, why I became a programmer.
As to throwing you off: since this mistake is so common, it would seem easy enough to train yourself to treat the two words as homonyms of each other. After all, you have no problem, I take it, with all the other homonyms in the language. (Yeah, I know, wheelchair ramps are annoying!) The fact that the words are pronounced identically proves conclusively that we don't need the words to be distinguished, other than by context.
As to having linguistics backgrounds, in my classes they beat prescriptivism out of me. How did yours survive?
I guess I kept mine intact by studying other languages and having to deal with people internationally. And my fondness for "The Elements of Style" and "On Writing" fostered it natively.
I should think that dealing with people internationally would provide more tolerance for questionable grammar, not less, since non-native speakers of English typically make quite a few grammatical errors. Especially for irregular and idiomatic aspects of grammar.
Take those ignorant Brits, for example. They are always saying "different to" rather than "different from". That's much more annoying than using "it's" for "its", as it's not even pronounced the same way. And don't get me started on them misspelling "color" and "rumor".
I've mentioned this elsewhere in the comments, but you're extremely unlikely to ever see a non-native speaker write "greater then" rather than "greater than." It's a pseudo-homophone (they really are pronounced differently) that just doesn't exist to anyone picking up English as another language. More than that, it's confusing as hell for them to read because it's a form that makes no sense. "would of" and "would've" is another that comes to mind. Mixing "lose" and "loose" up seems fairly common nowadays, too.
You're right in that those not fluent in English make mistakes that seem odd, but I don't think I've ever come across a case where the error wasn't with a different tense or form of the verb. I.e., the sentence is structured a bit oddly, but it largely makes sense. The same cannot be said of "would of."
> I don't think I've ever come across a case where the error wasn't with a different tense or form of the verb. I.e., the sentence is structured a bit oddly, but it largely makes sense.
Oh, come now. English, and I presume every natural language, is replete with idioms that are virtually impossible for a non-native speaker to ever master. (E.g., "different from" is the correct idiom, while "different than" is not. But non-native speakers often mess up idioms that I didn't even realize were idiomatic until I heard a foreign speaker get them wrong. Things like "Put that at the table" rather than "on the table", etc.) If you learn a language after puberty, you will almost certainly never speak it with the fluency of a native speaker. Something in our brains restructures itself after that point.
And this is why your claim is so insidious. If someone grows up in a poor community which speaks a different dialect of English from "Standard Written English", and they don't become fluent in Standard Written English before the age of puberty, they are likely screwed for life. They are biologically determined to NEVER master it to the degree that you require. But since you have a background in Linguistics, you must already know this.
This whole topic makes me rather depressed. It seems to me that anyone who promotes the idea that anything like perfect grammar is required to be a programmer--or anything other than an editor of some sort--is lacking in both compassion and ability to think scientifically. If you are willing to discriminate in hiring based on hypotheses for which there is ZERO scientific evidence, then you fail on both counts.
In fact, if I were to hire programmers, I think that instead I might test them on their abilities to be empathic and to think scientifically and logically, since these are skills that actually do matter for software engineers. Prescriptive grammarians would fail on all three counts.
By the way, do you have any idea how hard it was to get into and graduate from MIT being dyslexic? And yet now the world wants to add all these crazy hiring criteria hoops that I'm supposed to jump through??? Coding on whiteboards? Having perfect grammar? Doing handsprings and cartwheels? Haven't I proven myself enough already by getting into and graduating from one of the best and most difficult universities in the world, and then continuing on in a career with many commendable accomplishments? E.g., writing the software that configures an X-ray space telescope, and implementing the specificity scoring algorithm for RNA Interference hairpins; knowing how to program in a dozen-plus programming languages; etc. The increasing tendency towards myopic monoculture and inflexible hiring practices infuriates me. I consider it a personal attack on my mental and financial well-being. And it is!
Regarding your claim that "would of" makes no sense, I'm not sure how to square this with your claim of having a linguistics background. The first thing one learns in linguistics, which is the scientific study of how people actually do speak, as opposed to the unscientific field of prescriptive grammar, which aims to tell people how they ought to speak, is that if a community of people speaks or writes a certain way, then it always makes sense, and there is always a good cognitive reason for it.
That's not to say that the good cognitive reason always maximizes functionality, but the same criticism can certainly be made about prescriptive grammar too. A lot of the rules in prescriptive grammar are completely arbitrary, and don't reflect the real language, as actually spoken.
A clear example of this is the deprecation of double negatives. This prohibition is a modern invention of Bishop Robert Lowth's, from 1762. Before that, everyone in English said "I don't have none", just as they do in Romance languages. There was nothing wrong with the sentence "I don't have none" before 1762, and there is nothing wrong with it today, except to the extent to which you wish to distinguish yourself as part of the upper class. This cynical reason is the very reason that most of Lowth's silly invented rules were adopted by the upper class. There was nothing better about the way Lowth wanted people to speak compared to the way they had spoken before, other than as a tool of class distinction.
Another example is the word "ain't". The sentence, "Ain't I a genius?" is actually grammatically correct. Contrast this with how most people now will say, "Aren't I a genius?" which is just plain wrong. Why do people make this mistake? Because prescriptive grammarians beat "ain't" out of them, which caused people to start using something wrong. Sure, now the "correct" thing to say is "Am I not a genius?" But what kind of pretentious idiot would say that?
As to what sense there is to be made of "would of" and "it's" vs "its": This clearly expresses a natural human tendency to desire a more phonetic written language. Spelling in English is just plain dumb. Don't take my word for it--just ask the great author George Bernard Shaw.
I have friends from Italy, and they tell me that there are no such things as spelling bees in Italy. I.e., if you know how to pronounce a word, you know how to spell it. What an enlightened language! Perhaps without having to devote so much of their brains to keeping track of the spellings of tens of thousands of words, they can use it for more productive endeavors. I guess I have no scientific evidence for this, but I do hear that Italians are better lovers. That's gotta count for something!
tldr: A pox on the anti-scientific prescriptivists and obstacle-course-of-arbitrariness-makers of the world.
Perhaps standard grammar is not, as you argue, the best measure of semantic correctness. Then please, let us violate standard grammar wherever it improves accuracy.
Writing something different from what you mean because the two sound alike does not qualify.
I'm fond of Bertrand Russell myself. Does your argument stand on its own?
If we were to replace both "it's" and "its" with just "its", absolutely no expressiveness in the language would be lost. This is shown by the fact that in spoken English there is no audible distinction between these words. You can tell which word is being used 100% of the time by the context alone. No sane person ever argues that we should introduce an audible distinction between "it's" and "its" in order to make a more expressive spoken language.
I don't know what Bertrand Russell would say, but I know what the great writer George Bernard Shaw would say. He lobbied very hard for a transition to a completely phonetic alphabet. I think he sometimes even wrote using it.
That's completely hypothetical. I could say "cat" and "dog" should in the future both be called "dog." That doesn't really excuse calling cats dogs.
Perhaps a new standard of spelling could be created and, as you suggest, "it's" and "its" could be replaced by "its". But that's not what you're doing when you write "it's" instead of "its"; instead, you are writing a word that is different from the one you mean.
"Its" has a meaning now, today, that whoever parses your language will use. Perhaps you will convince everyone to overload that meaning with a new one, but in the meantime write what you mean.
In the meantime, I'm dyslexic, and you don't seem to be very considerate of that.
Additionally, what I said is not hypothetical. It's a linguistic fact. Your comparison of phonetic spelling to replacing "cat" and "dog" with one word is a linguistic falsehood.
Perhaps someday falsehoods will be as accepted as facts, but in the meantime, speak the truth.
Writing "pier" instead of "peer" is, just like writing "cat" instead of "dog," writing something different from what you mean. Your reason for confusing "pier" and "peer" doesn't make them the same. Perhaps it's hard to get the right answer, but that doesn't make a wrong answer right.
As for your dyslexia, I don't think it's relevant. Your case for phonetic spelling wasn't specific to dyslexics.
If "pier" and "peer" were merged into one word, then there would be sentences that are ambiguous. No such ambiguities arise with "its" vs "it's". They aren't even the same parts of speech.
You also have an incorrect notion of right and wrong. Read up on some linguistics and get back to me. E.g., there is nothing more "right" about "Standard Written English" than there is about the dialects spoken in inner cities. Both are equally good, full, rich languages. Though certainly mistakes can be made within a dialect, you must understand, in order to be a civilized human, that for many native English speakers, their native language is not Standard Written English but rather a regional dialect. Consequently, when they are in a situation where they are expected to communicate in Standard Written English, they are being expected to communicate in a language that is not their native language, and yet many people will not afford them the allowances that we make for people for whom no dialect of English is native.
As for my dyslexia, it certainly is relevant: this entire thread, from my first post onward (a direct ancestor of this post, and of yours), is about whether it is moral to deny people jobs just because they make such an unimportant mistake. It's a mistake that never actually matters, except to the pedantic, and it's a mistake that dyslexics are particularly prone to. Not because they don't understand the difference, but rather because they often can't see when such mistakes are made.
This is very much a cultural fit test. If people who use poor grammar are going to drive the author nuts, then as the person running the company I think he's perfectly justified in treating the candidate as a poor cultural fit.
Additionally, as someone who employs writers as well as programmers, he might reasonably want to hire a programmer who is also a writer for the same reason that you'd want to hire a programmer who also has ops experience: it makes them a more appealing hire for the kinds of problems his startup is trying to solve.
You don't see the problem because you look at it from a different direction.
Hiring good programmers is already hard enough. And you are likely to hire a bad programmer despite having them run through many programming interview sessions.
If you put all sorts of irrelevant criteria in between, you are likely to get people who have mastered the art of beating interviews by specializing in those criteria and diverting your attention from the actual issues. Then we all go back and complain about why it's so hard to hire 'good' programmers.
The OP generated a lot of strong negative reactions. But it also generated some strong positive reactions from programmers who share the author's frustrations and would enjoy working at a place where everyone punctuates impeccably. Like the author, my experience has been that programmers who care about language tend to write beautiful code.
I'm not looking for a job. But if I were, this article would make me more, not less, likely to apply.
I'm an owner at a software dev shop now, and I loved this article. I can't imagine actually using grammar as a litmus test, personally, but it definitely goes into my on-the-spot judgement criteria. I feel significantly less comfortable working with someone who lacks the critical thinking skills or personal professionalism to be intensely concerned with how they interact with others - and grammar is significantly about that.
If you can't be bothered to learn grammatical syntax, and it's something you've done your whole life, why should I expect that you'll be bothered to pursue perfection in other areas? If you are bothered to pursue it, why should I believe you have a chance at approaching it if you haven't nailed something you've done since you were five years old?
Anyway, I've not taken the time till now to really develop my understanding of this flash judgement I make of people, but it has always really bugged me. I have a friend who some people view as 'redneck' because he's got a bit of southern good-ol-boy in his heritage, but he's been reading books since he was young and it shows. We took a chance on him, brought him in-house, and trained him. One of the reasons I knew he'd become a great developer was because of the attention to detail he gave to everything that he did. It played out extremely well, for us and for him, and that one signal was a huge part of it.
Obviously anecdotal; typical caveats apply. Still, I'd rather live in the world iFixIt's pursuing than the world we presently live in, re: grammar.
EDIT: hilariously, for grammar. I missed a question mark.
Writing skills are clearly a BFOQ for an organization where your entire work product is written; there is absolutely nothing illegal about a writing test for a programming job.
It's not a sneak test - it's an actual grammar test. If you are asked to document something and you can't observe basic English grammar, I see a problem too. Zero tolerance may be taking it a bit too far, but one would imagine he can tell the difference between a typo in a big test and an actual, fundamental lack of knowledge of the grammar you should have learned by the time you were 14.
(Yes, the double space after the period is a built in habit from typing class, and a debatable item in the computer age, but nobody seems to care one way or the other.)
Typographers do! Double-space is OK for monospace fonts, but otherwise single-space is almost universally preferred for avoiding so-called "rivers" of whitespace.
Indeed. I'd point out that LaTeX does kerning independent of number of spaces after a period. I usually put two spaces when I type because I do a lot with monospace fonts, including all my LaTeX sources, but really, it is no substitute for proper kerning.
> I'd point out that LaTeX does kerning independent of number of spaces after a period.
This is misleading. Yes, LaTeX ignores how many spaces you put after the period in the source; but it does lay out extra space after sentence-ending characters. To avoid the extra space (e.g. if you have a mid-sentence abbreviation) you have to backslash it!
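A minimal sketch of what that backslashing looks like in practice (standard LaTeX spacing commands; the example sentences are borrowed from further down the thread):

```latex
% LaTeX assumes a period after a lowercase letter ends a sentence,
% so a mid-sentence abbreviation needs "\ " to force a normal word space:
... the Saga of St.\ Olaf with some of the other Old Norse literature ...

% Conversely, after a capital letter LaTeX assumes an abbreviation;
% "\@" before the period restores end-of-sentence spacing:
I sent the NDA to N.Y\@.  It arrived three days later.

% Or sidestep the distinction entirely with uniform word spacing:
\frenchspacing
```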
> (Yes, the double space after the period is a built in habit from typing class, and a debatable item in the computer age, but nobody seems to care one way or the other.)
Well, it does serve a purpose in the computer age - grepping for sentence endings.
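As a hypothetical illustration of that grep use (the sample text and commands are invented here, not from the thread): with the double-space convention, a true sentence boundary is a period followed by two spaces, so single-spaced abbreviations are skipped automatically.

```shell
# A real sentence break is ".  " (period + two spaces);
# "St. Olaf" has only one space after the period, so it is skipped.
printf 'Visit St. Olaf today.  It was nice.\n' | grep -o '\.  ' | wc -l
# The count is 1: only the true sentence boundary matches.
```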
Also, it makes it easier to determine when a sentence that contains an abbreviation ends. (This refers to when most people would use periods at the end of abbreviations.) I still use double spacing even though HTML collapses the extra space when rendering web pages.
The following sentences are an example of what I was saying:
"I sent the N.D.A. to N.Y. It arrived three days later."
The single space after the "A" in "N.D.A." indicates that the sentence has not ended even though there is a period.
The harder case is an "[A-Z]. [A-Z]" sequence that is not a sentence ending. Consider the following:
"When we compare the use of mistakes in speech in the Saga of St. Olaf with some of the other Old Norse literature, we can see that there is, in fact, a pattern of use of errors in speech being a sort of death omen."
The single space there identifies that there is no end of sentence between St. and Olaf. Of course when you are working in LaTeX you have to tell LaTeX that this is a word break and not a sentence break or else it will kern it as an end-of-sentence.
As a programmer, I can't stand working with coworkers who have lousy grammar and communication abilities. Most of my work either is communication, or will require communicating what I did to someone else, or communicating with someone else about what they did.
I can endure the relaxed grammar that texts and IMs use, but in practical & formal terms, I want decent writing and competent communication. I absolutely have zero interest in working with someone who has pathologically bad grammar & writing skills. We need to talk to each other, and grammar is a key component of unambiguous written communication.
Me too. And it does affect their programming in some obvious ways. Particularly if you are working with English language-heavy languages like Javascript. But even if you aren't, comments that are absent or unreadable can make working with someone's code a nightmare. Forget about documentation. And I'll never forget having to work with some code where a variable was "pubic_key."
What's so bad about a grammar test? Any competent person (excluding extenuating circumstances: dyslexia, etc.) should be able to pass a simple grammar test. It's not some hardship that you have to study for or anything.
It's more like I am applying for a programming job, and I don't see why I have to subject myself to your whims.
Consider that instead of grammar, I am advocating that I will only hire people who can do 50 push-ups. After all, what can I expect from people who are sloppy about something as important as their own bodies? What's so bad about 50 push-ups? Any fit person (excluding extenuating circumstances) should be able to do 50 push-ups.
The thing is, I don't see the correlation between grammar and the job I am supposed to do, but more importantly, by doing that, you aren't according me the respect I believe I deserve. I have a particular job and screening procedure in mind. If you ask me to skip rope, irrespective of my ability to do so, I will have to decline.
What? If you're writing software being able to produce clear documentation, write-ups, and other supplemental materials is quite important. In some cases, far more important than writing the actual code.
In some hires I've used a descriptive writing assignment, where I ask the candidate to describe in words what they are looking at. You can learn much from these things, including: analysis skills, close looking, how they think, and, well, how well they would document their software.
> What? If you're writing software being able to produce clear documentation, write-ups, and other supplemental materials is quite important. In some cases, far more important than writing the actual code.
You are taking it to the extreme where there are only 2 states: grammar purists and monkeys banging at keyboards. Someone who mixes up "its" and "it's" is capable of all the important stuff you listed. I mix "its" and "it's" a lot while typing. I know the difference; my fingers don't.
> analysis skills, close looking, how they think, and, well, how well they would document their software.
And none of it is a result of good grammar. Are you implying that somehow good grammar leads to good analytic skills? Or are you saying that talking with them gives you a window into their mind? If the latter, the window is wide open irrespective of how often I mix "it's" and "its". And of how I prefer my period outside of quotes or parens (Why? Because I like it that way. That's why).
So teach yourself not to make the mistakes because the next person in behind you working on your code might not know what you mean. If you know the difference it's just laziness to not communicate correctly.
As for periods outside of parentheses or not, that's syntax, not grammar. Your position here is weak. Good grammar is an indication that the person wielding the language at least takes care over what they're doing. A valuable trait for anyone, no matter what.
> Good grammar is an indication that the person wielding the language at least takes care over what they're doing.
Good grammar is an indication of good grammar, and that's that. If you believe it's otherwise, well, your beliefs aren't facts. Please provide citations.
"...If you're writing software being able to produce clear documentation, write-ups, and other supplemental materials is quite important. In some cases, far more important than writing the actual code."
Does the company in the original post not recognise the function of an editor?
Absolutely you do not have to subject yourself to his whims. You would be free to walk out the door if you found the grammar test offensive, or the push-ups test, or the rope-skipping test.
You do not deserve respect. Ever. You must always earn it.
> Absolutely you do not have to subject yourself to his whims. You would be free to walk out the door if you found the grammar test offensive, or the push-ups test, or the rope-skipping test.
> You do not deserve respect. Ever. You must always earn it.
There are multiple types of respect. "One human being to another" is the basic type. Everyone deserves that. Another is earned. And the last one is context dependent.
If I am coming in for a programming job, there are expectations and norms. If I come in and you ask me to unclog the toilet because you read in some blog about some CEO doing it, or because it represents loyalty or commitment or whatever the fuck, I am free to walk out, and I will walk out. But that doesn't excuse the fact that you didn't accord me the respect I deserved.
Oh god. I hate this attitude. Respect is a right, not a privilege. Why should anyone have to earn something so basic? Do we also have to earn our privilege to breathe air? We're all equals and equally deserve respect. Yes, even you after you muttered that banality.
I would say that we're definitely not identical, but we can all be equal.
A better question, though I will answer yours shortly, is what does equality mean? It's simply the relative value we assign something. The thing is, we can think in terms of valuable and not valuable, or we can skip the duality altogether and see everything on an equal level. A spade is thoroughly without value if you have no hole to dig, yet indispensable when you do. So what is its absolute value? Is it equal to a saw? Is one person equal to another? It all depends on your perspective.
The problem with seeing inequality between people in terms of respect is that you will treat some of your fellow man badly, since you don't respect them. If their value is merely a matter of perspective, which is ever changing, would it not make more sense to see past our biases and show everyone respect, whether they are equal to what we currently identify as respect worthy or not?
To me, respect means to consider someone in high regard. In my experience, how I look at people ends up affecting how they act. People always tend towards living up to the expectations placed on them. By showing someone that I don't respect them, I am pushing them down rather than helping them up.
Putting it another way, when someone is on the defensive, because someone else looks down on them and does not show them respect, they close up and are not receptive to new information. By showing them respect even if they are bad at something, they are likely to listen to what you have to say and change how they act as a result of your words. Disrespecting someone with "you type like an idiot", or ignoring them, will make them respond in kind. Validating their actions - "cool, a more condensed form of English!" - then suggesting something else - "I've had lots of success finding a job by writing like this" - might actually get them to change.
If we want society (and spelling) to improve, we have to embrace those we call idiots and teach them, not push them away.
If someone is an awesome programmer but can't write English to save his life, why not educate him? You'll certainly have a more loyal employee if he feels that he's gained important skills from you.
tl;dr - Disrespect is damaging, respect is nurturing. Respect everyone and the sun will shine brighter.
> You do not deserve respect. Ever. You must always earn it.
Hm. If you don't have my respect, I may well feel justified in stealing from you. Is that really the kind of 'lack of respect' you think should be the default state?
What's so bad about it is that unfounded assumptions are being drawn from the performance of said test.
Any competent person should be able to make a bed neatly and quickly. Would you apply for a (non-hospitality-related) job that required you to demonstrate your bed-making abilities?
Watching how someone makes their bed shows their attention to detail. Even if they don't know how to do it, if someone chooses not to try, or chooses to give up, it's a clear sign that the person will probably not do something outside their domain.
If you're in a startup, you wear many hats. You do what it takes to succeed. If you have no clue how to do sales, find a way.
>Watching how someone makes their bed shows their attention to detail. Even if they don't know how to do it, if someone chooses not to try, or chooses to give up, it's a clear sign that the person will probably not do something outside their domain.
I'm really not trying to be argumentative here, but this seems really silly. Replace "make a bed" with something like "kick a 40-yard field goal" or "craft a wooden chair" or, more simply, "tie a Windsor knot". All you're testing is whether they care about that particular task, which is in no way related to the skills you actually need.
Sure, it might sound silly, but sometimes it's not about whether you can actually do the task. It's more about how you will approach something that you are not expected to do. Different people have different approaches.
I agree that if you're hiring a programmer strictly to program and do nothing else, that's fine. But if you're looking for someone who thinks outside the box, who is capable of coming up with ideas and solutions when the solutions are yet to be known, having tests like these is probably a way to figure out how a person thinks. Whether it's a good way or not, I'm not so sure, as I've never had to administer one of these tests.
The idea of using grammar as a litmus test for all new hires is akin to saying that nothing is more important than natural-language grammar. That could not be further from the truth. I've read plenty of extremely thoughtful, info-dense specs in the open-source community that had typos and grammar errors. I've seen countless emails from colleagues that were clearly quickly written and thus contained errors. The essential ideas were nonetheless transmitted. My mind is able to auto-correct when needed. Perhaps if the rules of English grammar were based on logic and reason rather than the memorization of arbitrary, capricious rules, I'd be willing to attach more weight to the grammar skills of non-professional writers.
> I've read plenty of extremely thoughtful, info-dense specs in the open-source community that had typos and grammar errors. I've seen countless emails from colleagues that were clearly quickly written and thus contained errors. The essential ideas were nonetheless transmitted.
And because people tolerate it and allow this to happen, we live in such an ugly world, where people don't give a damn about quality.
Similarly, nobody has fine tapestries on the walls of their server rooms. Nobody cares about the insulation properties, apparently, but worst of all, nobody cares about the attention to detail and quality a fine tapestry represents. Thus we live in a terrible world with no appreciation of the finer things, because if you don't mind the lack of tapestries, how can you possibly care about a lack of code quality?
Getting basic grammar wrong is not like having no tapestries on the walls of your server room. It's like having a big mess in the server room, where machines lie around in disorderly fashion and the cabling is a tangle. Yes, it works. But it looks terrible. Also add unfinished pizzas lying on the floor for an accurate representation of people who don't care about spelling.
> If you're writing more code than natural language (documentation, discussion of specs, interaction with the team, etc) something is extremely wrong.
As long as we are just throwing claims around, I posit that unless the code you write is 99% of what you write, something is extremely fucked up, and the orcs will take over the world.
I don't see how your claim is any more valid or invalid than mine. Neither claim is backed by data, and personal perceptions are, well, personal. No one other than the person experiencing them gives a damn.
Perhaps, except that hypothetical code-hungry orcs are imaginary and have nothing to do with anything, while the value of any of my very real examples is well understood by pretty much anyone. (Though, if you'll forgive me drawing further from our mystical bestiary, trolls might be an exception?)
Edit to explain why I'm so dismissive:
I think that it's a fairly accepted axiom that specified and documented projects are easier to maintain than the alternative. Likewise with code size versus feature set. I think it's self-evident that teams that communicate in natural language (even if only via a ticket system) are more functional (or at least more tolerable to be a member of) than teams that do not.
Deriving my claim from these seems reasonable enough to me, given the context of the discussion (a comment thread on a relevant topic on some website a minority of people care about).
Not everything is science, and while it might be nice to have 5-sigma data to reinforce my opinion, it fortunately doesn't need to be so reinforced in order to be valid, or even to be worth sharing.
> Perhaps, except that hypothetical code-hungry orcs are imaginary and have nothing to do with anything while the value of any of my very real examples is well understood by pretty much anyone.
What does this even mean? I don't see any real examples, and all you are doing is throwing more claims around. I missed the memo where you, and whoever this "everyone" else is, were appointed the authority on the value of anything for everyone else.
> Edit to explain why I'm so dismissive:
I think that it's a fairly accepted axiom that specified and documented projects are easier to maintain than the alternative.
So a project with beautiful documentation and totally awful code is easier to maintain? Documentation, more often than not, is for the end user. As far as code maintenance goes, the most important factors are proper abstractions and encapsulation. If you wrote a 5,000-line, well-commented method, it doesn't help me at all.
And only a very specific set of projects lends itself to, and requires, up-front specs. The majority of the real world runs on "code is spec". Where is the spec for Linux? Here are a little-known someone's views on specs: http://kerneltrap.org/node/5725 Where are the specs for Rails, Sinatra, Django, Flask? And how would it help if a Rails spec suddenly came into being? You are confusing your little well with the world. Most projects design interfaces, not specs (ActiveRecord, the Rails 4 queuing API, etc.).
Even if your axiom held (it doesn't, at all), how would that imply that "if you're writing more code than natural language (documentation, discussion of specs, interaction with the team, etc) something is extremely wrong"?
> Deriving my claim from these seems reasonable enough, to me, given the context of the discussion
> Not everything is science and while it might be nice to have 5-sigma data to reinforce my opinon, it fortunately doesn't need to be so reinforced in order to be valid, or even valid to be worth sharing.
I didn't ask for 5-sigma data. I asked for data which isn't personal anecdotes and viewpoints presented as truth.
>You must be fun to work with. A CEO whose litmus test to hir a programmer isn't programming finesse or cultural fit or drive..., but how good is his grammar.
Love the typo in your last line. Love it even more if it was unintentionl.
There are far too many typos and grammar errors in that post for its writer to be playing around. I think someone is bitter about their poor grasp of grammar constructs.
"Apart from my words, I have my intent, thoughts, opinions, facts. Words are a medium." But in written communication all those other things must be conveyed by your words. That's what writing is.
> But in written communication all those other things must be conveyed by your words.
> That's what writing is.
So if writing is just words, my grammatically correct words are as good as Orwell's, right? Writing isn't just about words; words are but a minor detail. Why did anyone read "A Clockwork Orange" when it butchered words in some very creative ways?
Do you have anything interesting to say, are you bringing anything new to the table, are you reporting facts, is this an opinion piece? These, and a thousand other things, are what make writing a powerful medium. Knowing how to use correct grammar is a very small variable in the scheme of things.
Right. You should know the rules so you know when to break them. Stephen King hits on this point well in "On Writing." That book is a great read and takes a very non-parochial approach to writing.
I'm dyslexic, and if I were applying for a job (obviously not as a writer) and was told that there was a grammar test but I didn't need to take it, or that it didn't count, because I am dyslexic, I would probably walk away. How do I know that they aren't going to hold my dyslexia against me just as they would hold bad grammar against someone else? What's the difference between bad grammar and dyslexia? (Obviously I know the difference.) The person you tested may never have been diagnosed with dyslexia.
We will probably never know if anyone is really holding something against us. Sure, someone could tell you but others might choose not to disclose that information. We just go with trust and gut :)
Dear Mr. von Neumann:
With the greatest sorrow I have learned of your illness. The news came to me as quite unexpected. Morgenstern already last summer told me of a bout of weakness you once had
I am pretty sure that "already last summer" is poor English grammar. Tsk, this Gödel character. Probably not worth a face-to-face.
I am buying a home in need of some repairs, and my real estate agent recommended a contractor whom he has personally used a lot. The contractor inspected the home and prepared a competitively priced five-figure estimate.
From his estimate: "These conditions, if left to deteriate [sic] further, will make the building unsafe. ... The following recommended work is described by catagory [sic]: ... All leaks, disfunctional [sic] faucets & fixtures will be repaired or replaced."
It goes on. I'm shocked a professional would write something like this, especially since it is an MS Word document, so all the misspelled words are conveniently underlined in red.
But my real estate agent thinks I'm making a big deal out of nothing. Anyone have thoughts, experience, opinions, advice?
> But my real estate agent thinks I'm making a big deal out of nothing. Anyone have thoughts, experience, opinions, advice?
If you (trust|know|have worked with) your real estate agent, and he is vouching for the bad-grammar-and-spelling guy, how does the bad grammar and spelling matter?
Hire him, but look out for lack of attention to detail.
In my last job they talked a lot about the difference between being "done" and being "done done". The latter requires significant attention to detail, and especially a lack of an "it's someone else's problem" attitude.
While the substance of the work would likely be done satisfactorily and on time, wouldn't you want him to do the equivalent of spending 30 seconds reading the letter over once, clicking on the red squiggly lines? If those 30 seconds are not invested when writing an estimate, is he going to spend 30 minutes re-fitting the foo to the bar so it's perfect rather than just "done"?
Here is where I part from the OP. I do think of programming as a form of writing, and so I think it's OK to judge a programmer thusly, but I don't think it's got much to do with stocking shelves or plumbing or carpentry.
Moreover, based on your snippets, that contractor has trouble with spelling, not grammar, which in my opinion is a different sort of skill altogether. Also, I would not be surprised if he or she is dyslexic or similar.
Funny, I prefer misspelled to incorrectly corrected. Nothing is worse than a document littered with poor word substitutions; at least with a spelling error I can better guess what the writer intended.
Speaking of home purchases, if I were to do it again, I'd get two home inspections. My primary inspector for my current house completely missed very expensive (and relatively standard/easy to check) masonry and electrical deficiencies that were not up-to-code. His report was meticulous... perhaps because it wasn't prepared by him, but by an intermediary in the same office?
On this topic, I think that being able to communicate clearly is an essential skill. Poor grammar hurts, although nowhere near as badly as genuinely poor analysis. However, if you're going to rank someone on grammar, you should also rank them on their visual/spatial diagramming skills. A good diagram can communicate much more effectively than text. I'd rather have someone who is good at both language and diagrams than someone who is an excellent writer but unable to illustrate visually.
The one time I hired a home inspector, I got the report by the end of the three-hour appointment, printed from his laptop on a portable printer. It wasn't until I read your comment that I realized what a good thing that was.
I think the linguistic skills of someone who saws pieces of wood for a living are less relevant than those of someone who conjures up highly abstract concepts and communicates them to man and machine alike.
I think you're making a big deal out of nothing. You're hiring this guy because of his proficiency in using his tools: saws, hammers, wood, nails, construction knowledge, etc. It seems silly then to dismiss him because of his lack of proficiency in tertiary tools: words, word processing software, etc.
I wouldn't hire the contractor to work as an editor. But I would read the estimate to see whether it makes sense--has he identified all the true problems? Is he proposing to fix things that aren't broken? Is he proposing to use materials and fixtures of good quality? If I were satisfied with the answers to these questions, I would not worry about the writing.
I like how he ended a sentence with "with", and made that "with" a link to a note debunking the popular but silly belief that you should not end a sentence with a preposition. That was both clever and shows that he knows what he's talking about.
I wish I had a personal grammar nazi who'd poke me every time I made a mistake in English. Unfortunately, I don't have such an individual, so I'm left with myself and my doubts.
I have to balance near-perfecting English as my second language against learning and concentrating on stuff that is relevant to my domain. Although I am a perfectionist, I am also a realist and recognize that there are, in the end, things more important to my life than writing perfect English. Perfect English will not help me build all the great projects I have in mind.
I freaked out at first. I couldn't believe that they were criticizing developers for poor grammar. As I read on, I realized that they make a living on the written word and it all made sense.
This isn't a test for suitability for the position. It's a matter of signalling.
This is a cheap test (it surely can't take more than five minutes to complete a grammar test), but with the property that the only candidates who will -fail- it are candidates unsuitable for hiring. Since the author gives no statistics, we can only assume that enough people are weeded out to make it worthwhile.
It's much the same as a FizzBuzz test. I certainly don't feel offended if I'm given such a test at an interview, I just smile and do it.
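For readers unfamiliar with it, FizzBuzz is the canonically trivial programming screen the comparison leans on. A minimal sketch in Python (illustrative only; any reasonable variant passes such a test):

```python
def fizzbuzz(n: int) -> str:
    """Return 'Fizz' for multiples of 3, 'Buzz' for multiples of 5,
    'FizzBuzz' for multiples of both, and the number itself otherwise."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

if __name__ == "__main__":
    # The classic formulation runs over 1..100.
    print("\n".join(fizzbuzz(i) for i in range(1, 101)))
```

Like a five-minute grammar quiz, the point isn't that passing proves competence; it's that failing is a strong negative signal at near-zero cost to administer.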
"And just like good writing and good grammar, when it comes to programming, the devil's in the details."
A grouchy pedant might remark that this parses as "When good writing and good grammar come to programming, they are in the details; so also is the devil when it comes to programming."
I would not hire somebody who writes badly to be a technical writer, or to do any work that requires a lot of writing. However, I have worked with a number of people who did not write well but were very effective at complicated work.
These days programmers don't want to be bothered with the syntax of the languages they use and depend heavily on autocomplete and IntelliSense for even simple tasks. Yet it's this grammar that they should actually be learning and mastering. Still, it isn't even a criterion for hiring programmers these days.
And, by the way, where does this stop? I can argue that physical fitness is important for a programmer, so can I ask you to sprint a 1000m track as part of my interview process?
I interviewed at iFixit, and I took the grammar test.
iFixit is a great company with a solid mission, and a reasonable grammar test considering the nature of the company. It was a local radio interview a few years ago featuring Mr. Wiens which reignited my ambition to move out of the manufacturing/shipping sector by learning to code. I am still learning to code, and still have a long way to go before I'm ready to earn a paycheck doing so. However, I applied for a position I was well qualified for in the iFixit shipping department in hopes of further immersing myself in the 'culture'. I traveled six hours by train for the interview. I was left scratching my head when only a few minutes of the lengthy interview centered on my relevant experience, or on the requirements of the shipping position. A far larger segment of time was spent on grammar tests and logic puzzles (and no, they were not SKU- or shipping-themed puzzles).
I think the key to success with novel interview tests is to make sure you are filtering for the correct result. You don't want to inadvertently filter out highly motivated individuals.
Overall, the interview was a good experience, and probably results in the hiring of great gearheads & coders.
Teams that lack the element of craftsmanship in their culture will likely not care if a programmer knows grammatical rules of natural language (much less if he is a master of natural language). To simplify, whether you agree with the author correlates with what type of programmer you are (or want to hire). In this simplification, I'll call them "productive programmers" and "craftsmen programmers".
Productive programmers ship lots of code, create lots of value, and by business standards are model programmers. They're driven by quantity, volume, getting things out the door, and the solvency of the business.
Craftsmen programmers also "produce" and ship code, but they equally value maintainability, and therefore clarity in code. They're driven by the customer as much as the other consumer of their code, the maintainers.
From my experience, these very different types of programmers have very different priorities and values. Generally, the first set emphasizes clarity in communication only insofar as its absence impedes progress along the critical path. They JIT their eloquence, whether it's essential business communication or unstable/volatile/certain-to-change code. JITing is hard and not guaranteed to produce the optimal (clearest) output. The second group feels obligated to be clear in communication at all times, because any corners cut now may create pain for a fellow down the road. To excel at this, the second group is self-motivated to learn deeply both the spirit and the mechanics of communication (empathy and grammar). They spend more time analyzing the available solutions for potential misinterpretations, and therefore generally produce clearer output.
Thus, the applicability of the article depends on the culture and environment of the team.
" I've found that people who make fewer mistakes on a grammar test also make fewer mistakes when they are doing something completely unrelated to writing — like stocking shelves or labeling parts."
I take issue with this.
Assuming you could, and assuming they were available to work because they couldn't find any other job, would you hire a Rhodes Scholar to stock shelves?
If you are hiring for a blue-collar position like "stocking shelves", you want someone who is qualified enough, happy to be in that job, and planning to stay at that job, not someone thinking it's an interim job until they find something better. A person academic enough to have perfect English will many times be an underachiever who might have issues, which is why they are only "stocking shelves".
There is a reason why companies often say "you are overqualified for the job" when someone isn't hired. The job has to roughly fit the person.
As an aside, the OP didn't exactly define what is meant by "poor grammar" or link to the "mandatory grammar test" so readers could decide for themselves.
If they're good at what they do and not an imbecile, couldn't you give them a lesson or two in grammar? I was never taught English grammar at school: French and Russian grammar, yes, but no English grammar beyond learning a poem about it at primary school. I was educated in England; English is my first language.
How can you never be taught grammar? There are two English GCSEs: English Language and English Literature. One is about the technicalities of the language, the other about literature. Everyone has to do English Language.
There are indeed two GCSEs, but crucially we had only one type of English lesson. We studied literature and poetry, wrote stories and poems of our own, did presentations, read plays, memorised Shakespearean monologues, and such. But, barring that one poem in primary school, I was never taught what an adverb is, for example.
In French we'd look at past participles and different tenses, but never was it discussed what the pluperfect or future perfect was in English. In Russian we looked at the locative and genitive, accusative and nominative cases (and others, I'm sure), but in English there was never once a mention that anything such as a grammatical case existed. In school there was never a lesson on the apostrophe. Reading Truss's tome [Eats, Shoots & Leaves] recently made me wonder why on Earth I couldn't have been handed something like that as a kid (I was quite an avid reader for many years, but alas, of course, it wasn't written until 2003).
We did look at literary terms like onomatopoeia, alliteration, spoonerism and related concepts - metaphor, rhyming and timbre - that allow for analysis of poetry and prose.
Our English Language classes, and the exam incidentally, were about English usage and not really about the language itself: the construction of language, using English as the subject.
Grammar was just not in vogue at the time, I feel. I've always felt, however, that being taught English grammar would have helped my foreign-language learning immeasurably.
FWIW I got high marks in both English exams (though I felt I was robbed by my teacher confusing me with the boy I sat next to!).
Personally, I've found my grammar degrades after a day of coding. I can write well-formed sentences in general, but by the end of the day my mind is working in a much different way and my emails drift towards an embarrassing place.
I don't think there's anything wrong with having your own particular standards for hiring. If it works for you in solving your particular problem: great! One of the things I began to appreciate when I first started interviewing in the Valley is how much companies did or didn't care about nuance in their interviews. If the interview was too easy, it generally signaled to me that they didn't sweat the details and maybe didn't hire people who were going to stretch me professionally.
I see where the OP is coming from but this is the kind of mindset that could ensure you miss out on brilliant technical hires. I have a co-worker who only started learning English 15 years ago ... he speaks very good English, but every now and then he'll make written grammatical errors and spelling mistakes. Brilliant developer.
I'm a bit OCD about grammar myself and am definitely partial to people who are excellent writers and speakers of English, but I think making an allowance for non-native English speakers could:
a. help increase the diversity of people and ideas in your workplace
b. make sure you don't miss out on otherwise fantastic hires
It's funny. I'm a native French speaker married to an Australian, so my English is starting to be quite all right. What's funny is that "there, their, they're" or "its, it's" mistakes are painful to me, but at the same time I'm unable to write three sentences in French with no mistakes. I probably made a few mistakes in this post, but remember, English is EASY (at least from a European background). I'm ashamed of my own French writing; I tried (not too hard) to fix it, with very limited success. When I see this kind of easy, stupid, common English mistake, I'm really torn between empathy and disgust.
Now, Truss and I disagree on what it means to have "zero tolerance." She thinks that people who mix up their itses "deserve to be struck by lightning, hacked up on the spot and buried in an unmarked grave," while I just think they deserve to be passed over for a job — even if they are otherwise qualified for the position.
Putting the comma inside the quotation marks, as above, may make grammar nazis happy, but when your newly hired programmer treats all algebraic operations as commutative, it will do your company no good. Code is not prose, and coders are not writers.
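The commutativity point is a real class of bug, not a rhetorical flourish. A minimal sketch (purely illustrative, in Python) of everyday operations that do not commute:

```python
import operator

def commutes(op, a, b):
    """Return True if op(a, b) == op(b, a) for these particular arguments."""
    return op(a, b) == op(b, a)

# Addition commutes: 2 + 3 == 3 + 2.
print(commutes(operator.add, 2, 3))        # True
# Subtraction does not: 7 - 3 is 4, but 3 - 7 is -4.
print(commutes(operator.sub, 7, 3))        # False
# Neither does string concatenation: "ab" + "cd" != "cd" + "ab".
print(commutes(operator.add, "ab", "cd"))  # False
```

A programmer who swaps operands of a non-commutative operation has introduced a defect no amount of comma placement will catch.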
I've never understood this practice of putting punctuation that isn't part of a quote inside the quotation marks. It seems wrong both in terms of verity and in terms of logically constructing prose.
Can someone explain a rationale for this proclivity?
So there's no rationale beyond convention? Other linguistic conventions get changed, so why is this one sticking around, and why do its supporters promote it? Is the more logical [to me] method somehow confusing?
It's old printers' stuff to do with the physical properties of lead type, I think. As to why it won't go away, I have no idea. I happily ignore it in almost all my writing.
I'm not sure you can say it's typographic. One still uses quotation marks in handwritten prose, where there is no arrangement of "type" per se. It seems syntactical to me, but perhaps it is best categorised as orthography?
"And, for better or worse, people judge you if you can't tell the difference between their, there, and they're."
I used to make that judgment. I've since been humbled by many brilliant people who commit that and other common errors. I would be poorer, possibly impoverished, if I had dismissed those people from my circle.
As Peopleware pointed out many years ago, most people have something important to contribute, even if they don't contribute much in one specific area.
On the flip side, I'm suspicious of people who immediately comment on typos and grammatical mistakes when they read something. Usually I don't even register such mistakes; my brain must just automatically make sense of the text. When people notice small mistakes that don't hamper the actual meaning or communication at hand, I wonder whether they are really thinking about the meaning much at all, or just focusing on superficial details.
If he's talking about general grammar skills in any language, then it makes sense. Poor grammar always stands out as a negative trait, but it should not be the sole criterion for hiring. Some of the best developers are immigrants and don't always have the best command of the native language.
What if iFixit/Dozuki decides to expand into Germany or Japan and the OP fails a basic grammar test there? Will he consider himself a subpar resource?
The comments have become a firestorm of grammar-as-a-filter for programmers. While this is an interesting debate, I think there are three things about this filter that give it value:
1) Grammar serves as a cultural touchstone in the company, so using a grammar test in hiring is a strong signal to employees and future employees about what we as a company stand for. If you apply to this job and you disregard grammar freely (guilty as charged) you will not fit into this organization. It's a fast filter on both ends: I won't take the test, and if I did you would reject me immediately. Excellent.
2) It is a binary, non-complex test. You write the test once, and there is an unambiguous right and wrong answer: if you get the questions right, you pass; if you don't, you are rejected. This has several benefits:
Applicants: They know this is coming and can prepare (or not) for it. The test is objective, so they can't really argue with its validity. At one point they probably knew this material, meaning that an hour spent brushing up on the finer points of colons and semicolons is probably doable if they really want the job.
Employer:
Since the answers are unambiguous, this filter is easy to use: passed candidates go through. There is no subjectivity around assessing a candidate using a resume or cover letter. In my experience (as a CEO in the hiring-efficiency space), this easily actionable filter means that the task at hand will actually get done. If you watch recruiters try to parse 200+ resumes against a job req, they will stop after 10-15. They might come back to it another day, or they might not. Either way, the applicant pipeline stops dead on their desk. It's frustrating for the recruiter because the task becomes "analyze this free text against a free-text requisition and then filter this list of 200 people down to 20."
The reality of this situation is that most resumes don't get read, and the person who ends up getting the job is one of the first 20 that were read. Obviously this situation is non-ideal, so I generally advocate an objective, simple filter as the first step of any process (before looking at resumes). Internal recruiters, in their heart of hearts, want to find the best applicant in the bunch; if you don't give them the tools to do their job, they won't.
3) They've thought about their process and institutionalized it. When I see a company that has thought seriously about the filter stage of the process, I know: a) they care about their employees' time, b) they care about quality applicants and the culture they project in the hiring process, and c) they have probably thought about all stages of the hiring process, so things are likely to move quickly and smoothly.
My mother tongue is Spanish; I also know English, Portuguese, and some Italian. I'm not good in any of these languages. In the last few years I have put a lot of time into getting better at English. From reading that post and some comments, you would assume I don't read, and that I'm lazy, a slow learner, sloppy, and unmotivated. Proper grammar is important, but I'll not make it a priority while looking for talent.
No, just illiterate. The original author and the commenters here all made allowances for non-native speakers of English, which apparently you couldn't be bothered to read.
This kind of outlook, and having a grammar test, doesn't make me think you'd be a great boss to work with, nor does it make me want to work for you. I may not be the cultural fit you're looking for, but you're undoubtedly shrinking your potential talent pool and removing exceptional candidates by sounding like a stiff, inflexible place to work as a programmer.
As a foreigner, I can try to answer that. I've noticed that in general foreigners have a much better grammar than English native speakers, because we (at least in my case) went through boring English classes. Learning English was a conscious effort, and every grammar rule had to be learned and understood.
Reading English also requires a conscious effort, and reading poorly written English requires an even greater effort to parse the bad and incorrect clauses, something a native English speaker doesn't have to do.
> I've noticed that in general foreigners have a much better grammar than English native speakers
I have to disagree strongly; that's not my experience at all, with Indians, Chinese, Germans, Spaniards, Belarusians... One exception: a Greek who could kick my verbal ass. Of course, native speakers have the huge advantage of proper use being just intuitive, but, still, you made an absolute statement.
You may be interested to know that when a conquered people adopts the conquerors' language, they take on the vocabulary, but much of the grammar of the old language remains. I read an interesting paper about some grammatical trick you could do in Germanic languages, but not in English. Why? Maybe because, in this way, Celtic was shining through? No, non-British-Isles Celtic has the same trick. The paper speculated that it was a remnant of the grammar of pre-Celtic, prehistoric Britain. (Just to illustrate how hard it is to adopt a new grammar.)
Also, note, general foreigner, it's 'much better grammar' not '/a/ much better grammar'.
As an interviewee, the grammar test reinforced my confidence that iFixit was the right fit for me. I don't want to work with leadership that doesn't "put their money where their mouth is." I love that Kyle is uncompromising in this regard, as it sets the bar high, and we all continue to strive for excellence.
When I screen, interview, and hire people, I look for reasons to hire someone, as opposed to trying to find reasons to disqualify them. This kind of negative approach (especially when dealing with people) is just not a good way to go about things, in my opinion and experience.
I'm sure you do both. Why would you ignore negatives?
Like "this guys skillset is great so were going to ignore that he was fired from his last job for laziness and that all his department left because of his sociopathic behaviour". Seriously?
Nice dramatic title. Of course the drama tanks when the reader finds out that it's for a job where people write for a living. That's like asking a programmer to do some coding in the interview, which is, IMO, not weird at all and rather something everyone should consider.
Yes, technical writers are required to use proper grammar. Why is that controversial? I do find it ironic that his prose has commas in places where most people would not put them; his placement is probably correct, but still somewhat less readable.
You could write a similar article about why you only hire people who eat with their elbows off the table, and it would probably be just as meaningful. I agree that the basics are important, but I disagree that it matters to the extent you're suggesting.
My current grammar peeve is when people end a question with "... or no?" "Are you coming over tonight, or no?" "Do you like this outfit, or no?" "Do I sound like a teenager, or no?" What's up with that?!
Do any of you find yourselves talking like that, or no?
Grammar is a set of agreed rules, and therefore it is not set in stone. Someone might disagree with the rules, and that leads to a slightly different grammar, and so on and so forth. Effectively, everyone has their 'own' grammar that they use. IMHO (especially as a non-native English speaker), this article is rubbish (it's in the URL, BTW). One needs to know one's grammar to write articles, but for a developer position? I don't think so.
Oh please. I can understand not hiring someone who communicates poorly. That makes sense. But grammar is just a set of arbitrary rules that some person or group of people decided were correct. It's like a secret handshake that lets people know you're part of some smart boys club (by and large it seems like men are the only ones who enjoy bitching about grammar). To me, it says, "I went to a good school system and was taught correctly!"
Sure, I get a little anxious when people misuse your/you're. But to think it's an early indicator of success is at best limiting yourself by missing out on talent, and at worst surrounding yourself with like-minded pedantic grammar nazis who care about unimportant things. In my time as a software developer, I've met a lot of people who got angsty about grammar and weren't very inspiring programmers. I've met more people who couldn't figure out their and there, or you're and your, to save their lives, who were both fantastically good software engineers and effective communicators.
Let's keep a little perspective here: The goal of hiring isn't to find someone who's not going to annoy you by misusing grammar in emails. It's to find someone who's going to help you accomplish your goals. I get that details are important for your company. I just don't agree that knowing grammar rules indicates an attention to the proper details.
> But grammar is just a set of arbitrary rules that some person or group of people decided were correct.
That's like saying the evolutionary tree of life is just a set of arbitrary rules someone came up with. It's not; it's a description of an existing evolved system.
Grammar wasn't handed down from on high by an authority; it's an evolved, undirected cultural artifact. There were definitely influential actors, like the (IIRC) monks who first developed the capitals, lowercase, and italics we now use without a second thought, or Noah Webster's personal preferences for spelling, which he published and we now largely accept.
> To me, it says, "I went to a good school system and was taught correctly!"
I'm not sure. I guess it depends on the test - if it's asking "which is the past participle form of 'run'", then I'd agree, that's just memorization. But if it's simply asking "which sentence is more correct" and one of the sentences has an improperly conjugated verb, that's not something one even needs to be formally taught if they've read much.
If you can't string a sentence together such that it can be read unambiguously, how can I expect your code to be readable?
> I've met more people who couldn't figure out their and there or you're and your to save their lives who were both fantastically good software engineers and effective communicators.
I'd be willing to bet if you laid out two identical sentences, one with "your" and one with "you're", these people you speak of would be able to choose the correct one. They may not always use the correct one in the quick, ad-hoc process of writing a sentence, but I'm sure they know right from wrong if asked to scrutinize.
Natural selection was not a conscious choice. There have been many choices in grammar that have been decided almost arbitrarily by higher authorities. Sure, many of these choices may have been based on their own experiences, but they were not necessarily the best choices. They were decided by a small group of individuals, but again, the choice was not strictly a natural selection of grammar rules. Language more often follows this pattern than grammar because people are more likely to argue over grammar than language itself, excluding etymology, of course.
> There have been many choices in grammar that have been decided almost arbitrarily by higher authorities
Just because somebody makes a decree about grammar doesn't mean it'll be widely accepted, any more than an organism obtaining a genetic mutation means it'll become spread throughout the population in subsequent generations.
My point is just that there wasn't any overarching plan. People design their own use of it, but whether or not that becomes popular enough to be considered "the rule" is analogous to natural selection - messy, convoluted, frequently arbitrary, but good enough.
> it's a description of an existing evolved system.
This is not how the people who get angriest about grammar see it. They demand that splitting an infinitive is wrong, they demand that the passive voice is wrong (and those demands get stronger the more often they themselves use it), and they demand any number of other idiot things of the language, because they don't know the difference between grammar and style, and they fail to realize that dictating style requires taste, which they lack entirely.
> "which sentence is more correct" and one of the sentences has an improperly conjugated verb
Hm. Don't give this test to people who speak AAVE [1]: You'll be considered racist.
[1] AAVE is 'African-American Vernacular English', or what the conservative yammer-heads called 'Ebonics' back in the 1990s. It has a rich inventory of verb conjugations considered incorrect by standard English.
> Don't give this test to people who speak AAVE: You'll be considered racist.
Perhaps, but probably not. In my understanding, most AAVE speakers (and speakers of other "low-class" dialects, for that matter) are bidialectal with Standard American English and can code-switch to it in more formal situations. And those who aren't able to do so likely haven't been exposed to the education and training necessary to even be qualified for a job like this in the first place; otherwise they surely would've picked it up along the way.
>it says "I went to a good school system and was taught correctly!"
I cant imagine how people can manage to make it past 25 without learning the majority of the rules simply through diffusion; eventually being exposed too enough correct righting that they know the rules without even realizing they know them.
There are subtle rules that you might see but not understand which aren't obtainable through such a method, and those are ones that Im willing to forgive. I'm sure I make some of them my self.
Irregardless errors in fundamental and extremely obvious rules such as this post say too me I have never consumed my first language in writing willfully and may possibly have zero attention to detail.
Tell me you didn't cringe - then tell me that the clients, colleagues or superiors of our prospective hiree won't.
A guy on my team uses "undue" instead of "undo" in commit messages. He's a nice guy and as much as I'd like to, I can't repair the damage that does to my image of him; "do", "due" and how "un-" works are first grade material at the latest. I simply cannot understand such a mistake.
"Oh please. I can understand not hiring someone who communicates poorly. That makes sense. But grammar is just a set of arbitrary rules that some person or group of people decided were correct. It's like a secret handshake that lets people know you're part of some smart boys club."
Sounds like every programming language syntax ever.
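To make the comparison above concrete, here is a minimal sketch (my own illustration, not from the thread) of how a programming language enforces its arbitrary, agreed-upon syntax rules with zero tolerance. The choice of `==` rather than `=` for comparison in Python is pure convention, yet the parser rejects any deviation outright, much like the strictest grammarian:

```python
def parses(src: str) -> bool:
    """Return True if `src` is syntactically valid Python."""
    try:
        # compile() only checks syntax here; nothing is executed
        compile(src, "<example>", "exec")
        return True
    except SyntaxError:
        return False

print(parses("if x == 1: pass"))  # conventional spelling: True
print(parses("if x = 1: pass"))   # same intent, wrong "grammar": False
```

The second string would be perfectly understandable to a human reader, but the language's "secret handshake" rejects it anyway.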
For a generic job X, that may be true. But when you're being hired at a company where your job is explicitly going to be writing clear technical prose, I don't think it's unimportant at all.
Grammar is probably more important for sales than it is for developers. I get a few sales pitches and tend to dump the ones with poor grammar. I feel like: if they can't be bothered to send a clean pitch, how well are they going to manage my account?
The problem is that most people who rail against poor grammar haven't the slightest clue what grammar actually entails, and are instead peeving on stylistic issues that vary both due to the region the applicant hails from and how old the applicant is. In a lot of cases, they may as well be basing a hiring decision on whether the applicant says 'bubbler' instead of 'water fountain' or 'pop' instead of 'soda'.
For me this comes down to the HR dilemma; it is why I feel sympathy for the dreaded HR drones, and in this case for the author having to make HR decisions:
They are people responsible for making decisions about applicants they know practically nothing about, based on nothing but a bit of self-presentation, and ultimately they are in trouble when that applicant turns out to have been the wrong choice. No wonder they turn to obscure "voodoo" and "dark magic" as sure-fire ways of weeding out allegedly "bad" applicants... in this case, instead of a crystal ball he uses a grammar test to make decisions about staffing, among others, technical and engineering positions. I am sure a lot of excellent programmers and engineers aren't necessarily the best with words, and you have no idea how well they might do if you require them to pass a grammar test as the bare minimum. You might just as well ask them to paint you a few pictures and then draw conclusions about their ability to visualize things and software architectures... while there could be correlations, it just has nothing to do with their job and the rest of their skills.
Here is an idea for hiring people and for ending the strange fascination that comes with it: building on a core team of excellent people and a good, existing culture, you bring in people mainly through references. I saw this work especially well when bringing in students, but it applies to former colleagues as well; they know who the good and the go-to people amongst their colleagues are or were, and the good people know it better than anyone else. They also know the person from working with them, so chances are the applicant will be a great fit for the existing team. As an alternative, you could let your people teach some classes or give trainings, and they will very quickly know who the good students are. Then just hire the applicant and give them a realistic chance to prove themselves in a project. Chances are very good that within a few months or even weeks it will become very clear whether they are a good fit or not. Bad people WILL disqualify and alienate themselves; in any healthy organization you can count on that, and if you don't have any good people at all, well, then your whole hiring voodoo is pointless anyway, because the best new hire will drown in your swamp.
I have seen this work like a charm at my last employer. I was brought in by my Linux teacher, who was working there; I went for a beer with the guys, liked the atmosphere and the people, and they could get to know me too. Then I showed up to the interview with the CEO in a very relaxed environment, it was pretty clear they wanted me, so we just discussed details and that was it. Effort on their part was zero, and on my part a mild hangover after a great evening and no stupid self-presentation and application shenanigans. I saw people come and go at that place for a few years, and without fail the ones that left were a bad match or just bad employees. The good ones stuck around, and I am best friends with all of them, even to this day, more than three years after the place went belly-up and I moved to a different country. At that time each one of them found a new job very quickly, and a lot of former customers continued to bring them business just because the people were so good and reliable. Each time we get together, twice a year now, that "old spirit" flares up again and we feel just like in the "good old days". This is also the best network you could ask for; it "grew naturally", if you will.
Hire through references, give people a realistic chance, and be smart about managing the expenses, side effects, and repercussions of doing so. I cannot see how this is not far more efficient and more reliable than the whole bloated fortune-telling HR mumbo-jumbo.
Some advice - just because you happen to like perfect grammar, don't enforce your preference on others. Some of the best devs I've ever worked with have poor grammar, but it has never stopped them from making huge contributions to the team.