TY - JOUR
T1 - Unpacking Algorithms as Technologies of Power: Syrian refugees and data experts on algorithmic governance
T2 - Digital Geography and Society
AU - Kasapoglu, Tayfun
AU - Masso, Anu
AU - Calzati, Stefano
PY - 2021
Y1 - 2021
N2 - The article explores algorithmic governance through the lens of Foucault's work on governmentality. Algorithms are understood as “technologies of power” that literally “subjectify” the individuals upon whom they act. Our main focus is on Syrian refugees in two national contexts - Estonia and Turkey - and we consider four types of algorithms to which refugees are subjected: relocation, police risk scoring, recommendation algorithms and online advertisements. The goal is to explore, via a series of interviews with 19 refugees and 24 data experts, the “algorithmic imaginaries” that both the refugees and the Estonian experts who work with migration data hold about these technologies. Our findings show that, while relocation and police risk-scoring algorithms are perceived as technologies of power responsible for producing macro-differences without paying sufficient attention to individual needs, recommendation and ad algorithms are seen as less threatening, i.e. as “technologies of the self”. From here, we suggest reconsidering algorithmic governance as an iterative practice to eventually transform the datafied “knowledge of the self”, suitable for algorithms, into a true “care of the self”.
UR - http://www.scopus.com/inward/record.url?scp=85121458688&partnerID=8YFLogxK
U2 - 10.1016/j.diggeo.2021.100016
DO - 10.1016/j.diggeo.2021.100016
M3 - Article
SN - 2666-3783
VL - 2
JO - Digital Geography and Society
JF - Digital Geography and Society
M1 - 100016
ER -