Editor’s Note: ENV is pleased to welcome P.J. Levi as a new contributor.
Knowing that ENV’s readers include many teachers and other educators, I thought I would offer a brief teaching idea, especially for those who have their students submit papers to SafeAssign or some similar plagiarism-detection software.
One of my own students recently emailed me, quite frantic that SafeAssign had given him a ~30% matching score. (SafeAssign compares a student’s paper against other students’ papers and the Internet.) “But,” he assured me, “I didn’t copy anything!”
A couple of lessons to be gleaned:
- There are only so many ways to “write” or “encode” a similar meaning or function. Thus some matching is expected.
- A paper with a 30% score isn’t 30% identical to any other paper. The score is a rough, overall estimate produced by matching words and fragments of sentences, regardless of where they appear in the text.
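To make the second point concrete, here is a toy illustration of fragment-matching, my own sketch and not SafeAssign’s actual algorithm: count the word trigrams (three-word runs) that two texts share, ignoring position. Even independently written sentences on the same topic share stock phrases, so the score is well above zero.

```python
def trigrams(text):
    """Return the set of three-word sequences in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def match_score(paper, source):
    """Fraction of the paper's trigrams that also appear in the source,
    regardless of where they occur. A crude position-blind similarity."""
    paper_grams = trigrams(paper)
    if not paper_grams:
        return 0.0
    return len(paper_grams & trigrams(source)) / len(paper_grams)

# Two sentences written independently about the same topic:
a = "the mitochondria is the powerhouse of the cell and produces energy"
b = "every textbook notes that the mitochondria is the powerhouse of the cell"

print(round(match_score(a, b), 2))  # prints 0.67
```

Here two honest sentences “match” at 67% simply because they share one common phrase, which is exactly why a 30% overall score does not mean 30% of the paper was copied.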
One might explain all this to the class and then ask, almost offhandedly: “How does that make you feel about the certainty of human/chimp genome comparisons, given that we (I take this for granted!) understand the language of DNA far less well than we understand English?”