Better calibration when predicting from experience (rather than description)

Publication Type: Journal Article
Organizational Behavior and Human Decision Processes, 2019, 150, pp. 62–82
© 2018 Elsevier Inc. The over-precision bias refers to the tendency for individuals to believe that their predictions are more accurate than they really are. We investigated whether this type of overconfidence is moderated by how task-relevant information is obtained. We contrast cases in which individuals were presented with information about two options with equal average performance – one with low variance, the other with high variance – in experience format (i.e., they observed individual performance outcomes sequentially) or description format (i.e., they were presented with a summary of the outcome distribution). Across three experiments, we found that those learning from description tended to be over-precise, whereas those learning from experience were under-precise. These differences were driven by a better-calibrated representation of the underlying outcome distribution among those presented with experience-based information. We argue that those presented with experience-based information learn better because they have more opportunities to experience prediction errors.
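The two presentation formats contrasted in the abstract can be illustrated with a minimal simulation sketch. All names and parameter values below (means, standard deviations, trial counts) are hypothetical illustrations, not the paper's actual stimuli:

```python
import random
import statistics

random.seed(42)

# Assumed setup: two options with equal average performance,
# differing only in outcome variance (values are illustrative).
LOW_VAR_OPTION = (100.0, 5.0)    # (mean, standard deviation)
HIGH_VAR_OPTION = (100.0, 25.0)

def experience_format(mean, sd, n_trials=50):
    """Experience format: observe individual outcomes sequentially."""
    return [random.gauss(mean, sd) for _ in range(n_trials)]

def description_format(mean, sd):
    """Description format: a summary of the outcome distribution."""
    return {"mean": mean, "sd": sd}

# A participant in the experience condition sees a sample of draws ...
observed = experience_format(*HIGH_VAR_OPTION)
# ... while one in the description condition sees only the summary.
summary = description_format(*HIGH_VAR_OPTION)

print(f"experience: n={len(observed)}, "
      f"sample mean={statistics.mean(observed):.1f}, "
      f"sample sd={statistics.stdev(observed):.1f}")
print(f"description: {summary}")
```

The key contrast is that the experience condition yields trial-by-trial outcomes (and hence repeated prediction errors against each new draw), whereas the description condition conveys the distribution's parameters in a single summary.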