Speaker: Drazen Prelec

Drazen Prelec
November 30, 2018
3:30 PM - 5:00 PM
Ramseyer Hall, Room 100


Drazen Prelec (MIT) will deliver a talk entitled, "Finding truth when most people are wrong" (cosponsored by the DSC). See abstract below.

Abstract: The question of whether to trust the judgments of a few experts or the wisdom of the crowd is not just of scientific but also of political and philosophical interest. Crowd wisdom is usually defined as consensus: the majority vote or the median estimate or forecast. This principle seems fair and simple, but it has a blind spot for information that is new or unfamiliar. The cost of wrong collective decisions can be high: environmental risks underestimated or promising ideas ignored. The challenge is to combine the virtues of a 'democratic' procedure, which allows anyone, irrespective of credentials, to register an opinion, with an 'elitist' outcome that associates truth with the judgments of a select few. I will describe a simple alternative to democratic averaging by a panel or online crowd. The alternative principle is to select judgments that receive more support than predicted by those same people. I will review some recent evidence bearing on this approach and discuss extensions to forecasting.
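
For readers who want to see the stated principle concretely, here is a minimal sketch in Python of one way to check "more support than predicted" for a single yes/no question. The function name and the toy numbers are illustrative assumptions, not material from the talk, and the procedure presented in the talk may differ in its details.

def surprisingly_popular(votes, predicted_yes_share):
    # votes: list of booleans, True meaning the respondent answered "yes"
    # predicted_yes_share: each respondent's estimate of the fraction answering "yes"
    actual_yes = sum(votes) / len(votes)
    predicted_yes = sum(predicted_yes_share) / len(predicted_yes_share)
    # "yes" is selected if it is more popular than the crowd itself predicted;
    # otherwise "no" is the answer that beat expectations.
    return "yes" if actual_yes > predicted_yes else "no"

# Toy data: a 40% minority answers "yes", but respondents predicted only a 25% "yes" share,
# so "yes" receives more support than predicted and overrides the majority vote.
votes = [True, True, False, False, False]
predictions = [0.30, 0.20, 0.30, 0.25, 0.20]
print(surprisingly_popular(votes, predictions))  # prints "yes"

The point of the toy example is only that such a rule can select an answer the majority rejected, which is the contrast with democratic averaging that the abstract draws.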