Paper ID: 2407.18938

Mitigating Cognitive Biases in Multi-Criteria Crowd Assessment

Shun Ito, Hisashi Kashima

Crowdsourcing is an easy, cheap, and fast way to perform large-scale quality assessment; however, human judgments are often influenced by cognitive biases, which lower their credibility. In this study, we focus on cognitive biases associated with multi-criteria assessment in crowdsourcing: crowdworkers who rate targets on multiple criteria simultaneously may provide biased responses because of the prominence of certain criteria or their global impressions of the evaluation targets. To identify and mitigate such biases, we first create evaluation datasets via crowdsourcing and investigate the effect of inter-criteria cognitive biases on crowdworker responses. We then propose two model structures for Bayesian opinion aggregation models that account for inter-criteria relations. Our experiments show that incorporating the proposed structures into the aggregation model effectively reduces these cognitive biases and yields more accurate aggregation results.

Submitted: Jul 10, 2024
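
The abstract does not spell out the two proposed model structures, so the sketch below is a rough illustration only: it simulates a halo-style inter-criteria bias (per-criterion ratings contaminated by a target's overall impression) and aggregates responses with a simple alternating debiasing procedure. All names, sizes, and generative assumptions are hypothetical, and this alternating least-squares stand-in is deliberately simpler than the Bayesian aggregation models the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical setup (illustrative only; not the paper's data or model) ---
n_targets, n_criteria, n_workers = 50, 4, 30
true_scores = rng.normal(0.0, 1.0, size=(n_targets, n_criteria))  # latent per-criterion quality
halo = rng.uniform(0.0, 0.6, size=n_workers)                      # per-worker halo strength
noise_sd = 0.3

# Each worker blends the criterion-specific score with the target's overall
# impression (a halo-style inter-criteria bias) and adds rating noise.
overall = true_scores.mean(axis=1, keepdims=True)
ratings = np.stack([
    (1 - h) * true_scores + h * overall
    + rng.normal(0.0, noise_sd, size=(n_targets, n_criteria))
    for h in halo
])

# --- Aggregation with an inter-criteria bias term, fit by alternating updates ---
est_scores = ratings.mean(axis=0)   # initialize with the naive per-criterion average
est_halo = np.zeros(n_workers)

for _ in range(50):
    est_overall = est_scores.mean(axis=1, keepdims=True)
    # Estimate each worker's halo weight by regressing the residual of their
    # ratings on the direction of halo distortion (overall minus per-criterion).
    x = (est_overall - est_scores).ravel()
    for w in range(n_workers):
        y = (ratings[w] - est_scores).ravel()
        est_halo[w] = np.clip(x @ y / (x @ x + 1e-9), 0.0, 0.95)
    # Invert the assumed bias model r = (1 - h) * z + h * overall for each worker,
    # then average the debiased ratings to update the latent per-criterion scores.
    debiased = np.stack([
        (ratings[w] - est_halo[w] * est_overall) / (1.0 - est_halo[w])
        for w in range(n_workers)
    ])
    est_scores = debiased.mean(axis=0)

naive = ratings.mean(axis=0)
print("naive averaging RMSE :", np.sqrt(((naive - true_scores) ** 2).mean()))
print("bias-aware RMSE      :", np.sqrt(((est_scores - true_scores) ** 2).mean()))
```

In this toy run, the bias-aware aggregate recovers the per-criterion scores noticeably better than naive averaging, which mirrors, at a qualitative level, the abstract's claim that modeling inter-criteria relations yields more accurate aggregation results.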