If someone gets an answer wrong, that’s one thing. If someone gets an answer wrong while expressing a high degree of certainty in that answer, that’s even worse, because it reveals a lack of self-awareness about the limits of one’s own knowledge.
This is particularly relevant in the fascinating world of experts. Experts make all sorts of proclamations and predictions. Unfortunately, most experts and specialists are no better at making predictions than an informed amateur. What’s even more unfortunate is that experts do not seem to accurately express the degree of certainty behind their predictions. In other words, they are just as confident in their bullshit predictions as they are in their accurate ones.
Wouldn’t it be interesting, on exams in schools, to put a “certainty” scale next to each answer, so we could reward students who know what they don’t know and penalize students who take a wild-ass guess and happen to get it right?
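One way a certainty scale like this could actually be graded is with a Brier score, a standard rule from forecasting that penalizes the squared gap between stated confidence and the actual outcome. This is just a sketch of the idea, not something proposed above, and the function name is invented for illustration:

```python
# Sketch of scoring exam answers by both correctness and stated
# confidence, using the Brier score from forecasting. A confident
# wrong answer is penalized far more than an admitted guess.
# (Illustrative only; "brier_penalty" is a made-up name.)

def brier_penalty(correct: bool, confidence: float) -> float:
    """Squared gap between stated confidence (0.0-1.0) and the outcome.

    0.0 means perfectly calibrated; 1.0 means maximally
    overconfident and wrong.
    """
    outcome = 1.0 if correct else 0.0
    return (confidence - outcome) ** 2

# A confident correct answer beats a hesitant correct one...
assert brier_penalty(True, 0.9) < brier_penalty(True, 0.5)
# ...but a confident wrong answer is punished harder than a
# wrong answer the student flagged as a coin flip.
assert brier_penalty(False, 0.9) > brier_penalty(False, 0.5)
```

Under this kind of rule, the wild-ass guesser who marks 95% certainty and gets lucky gains little over an honest 50%, while the same guess gone wrong costs dearly.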
Are there other applications for a “certainty” question?
(hat tip to Jared Polis for helping illuminate this idea)