TY - JOUR
T1 - Bits of confidence
T2 - Metacognition as uncertainty reduction
AU - Fitousi, Daniel
N1 - Publisher Copyright:
© The Psychonomic Society, Inc. 2025.
PY - 2025/12
Y1 - 2025/12
N2 - How do people know when they are right? Confidence judgments – the ability to assess the correctness of one’s own decisions – are a key aspect of human metacognition. This self-evaluative act plays a central role in learning, memory, consciousness, and group decision-making. In this paper, I reframe metacognition as a structured exchange of information between stimulus, decision-maker (the actor), and confidence judge (the rater), akin to a multi-agent communication system. Within this framework, the actor aims to resolve stimulus uncertainty, while the rater seeks to infer the accuracy of the actor’s response. Applying techniques from information theory, I develop three novel measures of metacognitive efficiency: meta-U, meta-KL, and meta-J. These indices are derived from entropy and divergence principles, and quantify how effectively confidence judgments transmit information about both external stimuli and internal decisions. Simulations show that these measures possess several advantages over traditional signal detection theory metrics such as meta-d′ and the M-ratio, including more interpretable scaling, robustness to performance imbalances, and sensitivity to structural constraints. By formalizing metacognitive sensitivity as an information-processing problem, this framework offers a unified, theoretically grounded approach to studying confidence and sheds light on the sources of metacognitive inefficiency across individuals and contexts.
KW - Confidence
KW - Information theory
KW - M-ratio
KW - Metacognition
KW - meta-I
KW - meta-d′
UR - https://www.scopus.com/pages/publications/105015218078
U2 - 10.3758/s13423-025-02752-z
DO - 10.3758/s13423-025-02752-z
M3 - Systematic review
AN - SCOPUS:105015218078
SN - 1069-9384
VL - 32
SP - 2734
EP - 2762
JO - Psychonomic Bulletin & Review
JF - Psychonomic Bulletin & Review
IS - 6
ER -