
Transparency Paths - Documenting the Diversity of User Perceptions / Giunchiglia, F.; Kleanthous, S.; Otterbacher, J.; Draws, T. - (2021), pp. 415-420. (Paper presented at the 29th ACM Conference on User Modeling, Adaptation and Personalization, UMAP 2021, held in Utrecht, Netherlands (online), 21-25 June 2021) [10.1145/3450614.3463292].

Transparency Paths - Documenting the Diversity of User Perceptions

Giunchiglia F.;
2021-01-01

Abstract

We are living in an era of global digital platforms, ecosystems of algorithmic processes that serve users worldwide. However, the increasing exposure to diversity online - of information and users - has led to important considerations of bias. A given platform, such as the Google search engine, may demonstrate behaviors that deviate from what users expect, or what they consider fair, relative to their own context and experiences. In this exploratory work, we put forward the notion of transparency paths, a process by which we document our position, choices, and perceptions when developing and/or using algorithmic platforms. We conducted a self-reflection exercise with seven researchers, who collected and analyzed two sets of images: one depicting an everyday activity, "washing hands," and a second depicting the concept of "home." Participants had to document their process and choices and, in the end, compare their work to that of others. Finally, participants were asked to reflect on the definitions of bias and diversity. The exercise revealed the range of perspectives and approaches taken, underscoring the need for future work that will refine the transparency paths methodology.
2021
UMAP 2021 - Adjunct Publication of the 29th ACM Conference on User Modeling, Adaptation and Personalization
Online Proceedings
Association for Computing Machinery, Inc
ISBN: 9781450383677
Giunchiglia, F.; Kleanthous, S.; Otterbacher, J.; Draws, T.
Files in this record:

2021 FairUMAP-Transparency paths.pdf
  Access: Open access
  Type: Non-refereed preprint
  License: All rights reserved
  Size: 670.15 kB
  Format: Adobe PDF

3450614.3463292.pdf
  Access: Archive administrators only
  Type: Publisher's layout (editorial version)
  License: All rights reserved
  Size: 683.36 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/319677
Citations
  • PMC: not available
  • Scopus: 3
  • Web of Science (ISI): not available