A study at the University of Pittsburgh suggests spell-check software may level the playing field between people with differing levels of language skills by hampering the work of more exceptional writers and editors who place too much trust in the software.
In the study, 33 undergraduate students were asked to proofread a one-page business letter — half of them using Microsoft Word with its squiggly red and green lines underlining potential errors.
The other half did it the old-fashioned way, using only their heads.
In the group that did not use grammar software, students with high SAT verbal scores clearly outperformed students with lower SAT scores.
But among those who used the software, both groups made more errors, regardless of how well the students had performed on the SAT. Most surprising, the increase in errors was far greater for students with higher SAT verbal scores.
Dennis Galletta, a professor of information systems at the Katz Business School, said spell-checking software is so sophisticated that some people have come to trust it too thoroughly.
"It's not a software problem, it's a behavior problem," Galletta said.
Tim Pash, a technical specialist with Microsoft, said grammar and spelling technology is meant to aid writers and editors — not solve all their problems.
"At the end of the day, what you're dealing with is a mechanical tool that follows a fixed set of rules," Pash said. "The human mind and the creativity involved in free-form writing is always going to be restrained by a tool."
What that means, said Pash, is that no one should blindly accept the suggestions given by spell-check software.
In the study, the students were promised credit based upon how well they did on the test — even though, at the end, they all received the same credit.
Despite the small sample size, Paul Fishbeck, an engineering professor at Carnegie Mellon University, said the dramatic results are telling.
The study found that the software helped students find and correct errors in the letter, but in some cases students also changed phrases or sentences the software had flagged as grammatically suspicious, even though those passages were correct.
For instance, the letter included a passage that said, "Michael Bales would be the best candidate. Bales has proven himself in similar rolls."
The software — picking up on the final "s" in "Bales" — suggested changing the verb from "has" to "have," treating "Bales" as plural. Meanwhile, the spell-check ignored "rolls," even though it should have been "roles."
Many of the errors that went uncaught were real words used incorrectly, such as "rolls." Other examples included "please fined attached" instead of "find," and "Web sight" instead of "Web site."
Researchers found that students using the spell-checking software tended to ignore misspelled words if the computer didn't flag them.
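The failure mode the researchers describe is easy to see in a toy sketch. The fragment below is not Microsoft Word's actual checker; it is a minimal dictionary-based spell-checker, with an invented word list, that flags only words absent from its dictionary. Because "rolls," "fined," and "sight" are all legitimate English words, real-word errors like those in the study's letter sail through unflagged:

```python
# Invented word list for illustration only.
DICTIONARY = {
    "michael", "bales", "would", "be", "the", "best", "candidate",
    "has", "proven", "himself", "in", "similar", "rolls", "roles",
    "please", "find", "fined", "attached", "web", "sight", "site",
}

def flag_misspellings(text):
    """Return the words in text that are absent from the dictionary."""
    words = [w.strip(".,").lower() for w in text.split()]
    return [w for w in words if w not in DICTIONARY]

# "rolls" and "fined" are valid dictionary words, so neither
# sentence produces a flag, even though both contain errors.
print(flag_misspellings("Bales has proven himself in similar rolls."))  # []
print(flag_misspellings("Please fined attached the Web sight."))        # []
```

A checker built this way can only ever catch non-words; catching a correctly spelled wrong word requires understanding context, which is exactly what students stopped supplying when they trusted the tool.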
Without grammar or spelling software, students with higher SAT verbal scores made, on average, five errors, compared with 12.3 errors for students with lower SAT scores.
Using the software, students with higher SAT verbal scores reading the same page made, on average, 16 errors, compared with 17 errors for students with lower SAT scores.
"The experts performed just like the novices," Galletta said.
Richard Stern, a computer and electrical engineer at Carnegie Mellon University specializing in speech-recognition technology, said grammar and spelling software will never approach the complexity of the human mind.
"Write a note to Mr. Wright right now," Stern said. "One of the parlor tricks of speech recognition is a simple sentence, to us at least, that can be interpreted using statistical language modeling. Computers can decide the likelihood of correct speech, but it's a percentage game."
By Charles Sheehan