SciELO - Scientific Electronic Library Online

Revista de Estudios Sociales

Print version ISSN 0123-885X

Abstract

SCHERVENSKI DA SILVA, Tiago Luis and GLOECKNER, Ricardo Jacobsen. Algorithmic Governmentality and Facial Recognition Technologies: Punitive Reconfigurations in the Brazilian Context. rev.estud.soc. [online]. 2025, n.94, pp. 85-102. Epub Oct 25, 2025. ISSN 0123-885X. https://doi.org/10.7440/res94.2025.05.

This article analyzes the transformation of punitive practices in a contemporary social context where data and algorithms have assumed a central role in dynamics of power and control. It asks what role algorithms play in current punitive practices and what effects they generate. To this end, it examines the use of facial recognition technologies in Brazilian public security, with a focus on the states of Bahia and Goiás. Based on a review of the literature, the study explores the contemporary biopolitical landscape, marked by the crisis of traditional mechanisms of confinement. In response to this crisis, new forms of control have emerged that move beyond penitentiary technologies and establish surveillance mechanisms in open spaces, grounded in security devices and the logic of risk management. Within this framework, data become essential to punitive practices, driving the development of increasingly sophisticated surveillance techniques. The article investigates how networked computer systems capture, process, and deploy these data, examining their underlying logic and rationality. The research seeks to understand how these devices can produce biased outcomes, particularly against racialized individuals and those in situations of gender vulnerability. The findings indicate that (i) the use of this technology embeds a discriminatory bias that reinforces racial and gender inequalities in the context of surveillance; and (ii) these technologies intensify surveillance and data management under a supposedly objective rationality that ultimately generates predictive profiles and delivers silent verdicts on individuals.

Keywords: algorithms; algorithmic governmentality; facial recognition technologies; punitive practices; social control.

Abstract available in Portuguese and Spanish · Full text in Portuguese · PDF in Portuguese