
Generalized information theory meets human cognition: introducing a unified framework to model uncertainty and information search

Vincenzo Crupi; Katya Tentori
2018-01-01

Abstract

Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma–Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. © 2018 Cognitive Science Society, Inc.
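As a rough illustration of the unified formalism described in the abstract, the sketch below computes the Sharma–Mittal entropy family under one common parameterization (order r, degree t, natural logarithms). This is not the authors' code, and the paper's own notation, normalization, and logarithm base may differ; the function name and the tolerance handling are illustrative choices. With this parameterization, Shannon, Rényi, Tsallis, Hartley, and Quadratic entropies fall out as special parameter settings.

    import numpy as np

    def sharma_mittal_entropy(p, r, t, tol=1e-9):
        """Sharma-Mittal entropy of order r and degree t (natural logs).

        Illustrative sketch: Shannon (r, t -> 1), Renyi (t -> 1), Tsallis (t = r),
        Hartley (r = 0, t -> 1), and Quadratic (r = t = 2) entropies are special cases.
        """
        p = np.asarray(p, dtype=float)
        p = p[p > 0.0]                               # drop zero-probability states
        if abs(r - 1.0) < tol:                       # order r -> 1 limit
            shannon = -np.sum(p * np.log(p))
            if abs(t - 1.0) < tol:
                return shannon                       # Shannon entropy
            return (np.exp((1.0 - t) * shannon) - 1.0) / (1.0 - t)
        power_sum = np.sum(p ** r)                   # sum_i p_i^r
        if abs(t - 1.0) < tol:
            return np.log(power_sum) / (1.0 - r)     # Renyi entropy of order r
        return (power_sum ** ((1.0 - t) / (1.0 - r)) - 1.0) / (1.0 - t)

    probs = [0.5, 0.3, 0.2]
    print(sharma_mittal_entropy(probs, r=1, t=1))      # Shannon entropy (nats)
    print(sharma_mittal_entropy(probs, r=0, t=1))      # Hartley entropy: ln(3)
    print(sharma_mittal_entropy(probs, r=2, t=2))      # Quadratic entropy: 1 - sum p_i^2
    print(sharma_mittal_entropy(probs, r=2, t=1))      # Renyi entropy of order 2
    print(sharma_mittal_entropy(probs, r=0.5, t=0.5))  # Tsallis entropy of order 0.5

The r -> 1 and t -> 1 limits are handled explicitly because the closed-form expression is undefined there; those limits recover the Shannon and Rényi cases, respectively.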
Year: 2018
Authors: Crupi, Vincenzo; Nelson, Jonathan D.; Meder, Bjorn; Cevolani, Gustavo; Tentori, Katya
Citation: Crupi, Vincenzo; Nelson, Jonathan D.; Meder, Bjorn; Cevolani, Gustavo; Tentori, Katya (2018). Generalized information theory meets human cognition: introducing a unified framework to model uncertainty and information search. Cognitive Science, 42(5), 1410-1456. ISSN 0364-0213. DOI: 10.1111/cogs.12613
Files in this record:

File: Entropy_2018.pdf
Access: Archive managers only
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 1.22 MB
Format: Adobe PDF

File: 157868176.pdf
Access: Open access
Type: Refereed post-print (author's manuscript)
License: All rights reserved
Size: 2.47 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/223250
Citations
  • PMC: not available
  • Scopus: 58
  • Web of Science: 49
  • OpenAlex: not available