The 2016-2019 VCE Computing study design, U1O3 KK05, wants students to know “factors affecting the integrity of data, such as correctness, reasonableness and accuracy” – which is fine with me…
Until you start to ponder. What is the difference between “correctness” and “accuracy”?
The study design’s glossary does not help.
The Concise Oxford English Dictionary (11th edition) helpfully says:
- Correct (adjective) – (1) free from error; true or right. (2) conforming to accepted social standards.
- Accurate (adjective) – (1) correct in all details. (2) capable of or successful in reaching the intended target.
I don’t think the COED is very helpful.
So, what are students supposed to know about the difference between the terms in the key knowledge? Here is how I (a former English teacher) see it:
“Correct” (adjective) is an absolute term, not a matter of degree. Something is either correct or it is not; there are no degrees of correctness, and you cannot be partially correct.
“Accurate” refers to the degree to which the truth is represented in a statement or calculation.
The ISO standard on measurement methods (ISO 5725-1) defines ‘accuracy’ as a combination of trueness (closeness to the true value) and precision (consistency, or freedom from random error). Accurate observations are close to the true, actual value they claim to represent.
Examples (as I see it):
- Correct, but not very accurate: France is a country in the northern hemisphere.
- Correct, but more accurate: France is a country in Europe.
- Incorrect: France is a country in Africa.
- Correct but not very accurate: Pi = 3.14
- Correct and more accurate: Pi = 3.14159
- Incorrect: Pi = 8.08495869456
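The pi examples can be sketched in Python. This is a hypothetical illustration: the rounding test is my own way of checking whether a claim is “correct to the digits it shows”, and the absolute error measures how accurate it is.

```python
import math

# Three claims about pi, as in the examples above.
claims = [3.14, 3.14159, 8.08495869456]

for claim in claims:
    # Accuracy: how far the claim is from the true value.
    error = abs(math.pi - claim)
    # Correctness (by my rounding test): does the claim match pi
    # when pi is rounded to the same number of decimal places?
    decimals = len(str(claim).split(".")[1])
    correct = round(math.pi, decimals) == claim
    print(f"{claim}: correct={correct}, absolute error={error:.6f}")
```

Both 3.14 and 3.14159 pass the correctness test, but 3.14159 has a much smaller error, so it is more accurate; 8.08495869456 fails the test outright.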
My interpretation implies that a statement or observation can be correct but not very accurate, but never the reverse: to be in any way accurate, it must first be correct.
Challenge: Can anyone suggest an example where something is incorrect, but accurate?
A related term “precision” refers to the consistency and ‘repeatability’ of multiple observations, regardless of how closely they represent the truth. Precise results may be grouped closely together in value, but still be far from the objectively true and accurate value.
(Informatics students might want to interpret this as “the results have a low standard deviation.”)
Bathroom scales that show a person’s weight as 70kg, plus or minus 50 grams, every day for a week would be precise, because the variation between measurements is small. But if the person actually weighs 90kg, the scales would be inaccurate.
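That scales scenario can be sketched in Python. The reading values here are invented to match the example (a tight cluster around 70kg against a true weight of 90kg); the point is that precision and accuracy are measured by two different numbers.

```python
import statistics

TRUE_WEIGHT_KG = 90.0

# Hypothetical daily readings: within +/- 50 g of 70 kg.
readings = [70.03, 69.98, 70.01, 69.95, 70.04, 70.00, 69.99]

# Precision: the spread of the readings (their standard deviation).
spread = statistics.stdev(readings)
# Accuracy: the distance of the average reading from the true value.
error = abs(statistics.mean(readings) - TRUE_WEIGHT_KG)

print(f"standard deviation = {spread:.3f} kg")  # tiny, so the scales are precise
print(f"mean error = {error:.1f} kg")           # huge, so the scales are inaccurate
```

A low standard deviation with a large mean error is exactly the “precise but inaccurate” case.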
Another view of ‘precision’ in common usage refers to the amount of resolution in a measurement, such as the number of decimal places recorded.
So, bathroom scales might be:
- precise but not accurate: the scales consistently show similar values that are far from the true weight.
- accurate but not precise: the readings average out close to 90kg, but individual readings vary widely.
- both precise and accurate: the readings are always very close to 90kg. Such accurate and precise readings may also be classed as “valid”.
- neither precise nor accurate: the readings vary widely, and none of them is close to 90kg.
Although casual usage treats ‘accuracy’ and ‘precision’ as interchangeable, they are very different in strict scientific usage. It’s like the way VCE IT uses the terms ‘data’ and ‘information’, or ‘accessibility’, in a strict sense.