Critical Reflection on Digital Inequalities in the Age of AI and Big Data

by Padi Falas-Maifala

In this reflection, I analyze Christopher Lutz's (2019) article "Digital Inequalities in the Age of Artificial Intelligence and Big Data," which offers an in-depth examination of digital inequality research, highlighting important discoveries and knowledge gaps, particularly regarding emerging technologies. His conceptual framework of first-, second-, and third-level digital inequalities provides valuable insights, though I find it occasionally limited by the very disciplinary boundaries he attempts to transcend.

Lutz's systematic organization of digital inequality studies reveals a significant shift in academic focus from basic access issues to skills development and, ultimately, to differences in outcomes. His observation that "digital inequalities tend to mirror existing social inequalities" (Lutz, 2019, p. 141) aligns with my perspective that social power imbalances are often amplified rather than mitigated by technology. However, I believe his framing at times inadvertently supports a technological determinism that views digital access as inevitably beneficial given the "right" conditions.

Particularly compelling is his discussion of mobile technology as "second-class Internet access" (Lutz, 2019, p. 142), drawing on Napoli and Obar's theory of the mobile underclass. Lutz skillfully extends this concept to voice assistants and Internet of Things devices, characterizing them as mobile "walled gardens" and even more restricted "prison yards" (Lutz, 2019, p. 143). This analogy effectively captures the paradox of technological advancement that simultaneously enhances and constrains user agency, a contradiction I frequently observe in my own critical assessment of technological platforms.

One area where Lutz's analysis could be strengthened is his examination of second-level digital divides related to emerging forms of digital labor. Despite acknowledging that "digital inequalities research has been slow in investigating the emerging digital economy" (Lutz, 2019, p. 143), his discussion of platform labor focuses primarily on issues of collective action and surveillance, without fully exploring how these systems fundamentally alter economic relationships and exacerbate precarity. Platform capitalism accelerates the formation of a new "precariat" class, as argued by Standing (2011), a factor deserving greater attention in digital inequality studies.

In my view, Lutz's section on third-level digital inequalities is the most thought-provoking, particularly his recommendation to combine digital inequality frameworks with critical algorithm studies. His finding that "data-driven algorithms often reinforce established structural inequalities rather than shaking them up" (Lutz, 2019, p. 144) resonates with my critical perspective that technological systems embed and amplify existing social hierarchies. However, I would go further to suggest that algorithmic systems' classification schemes and operational logics actively create new forms of inequality rather than merely reflecting existing ones (Noble, 2018).

I believe Lutz's analysis falls short in its consideration of agency and power dynamics. While he effectively illustrates how inequalities develop across these three divides, the article would benefit from deeper theoretical engagement with questions of who benefits from these systems and how resistance might be organized. As Crawford (2021, p. 8) argues, AI systems are "a registry of power" that concentrates authority in the hands of tech companies and their state partners.

Conclusion:

Lutz provides a valuable framework for understanding digital disparities in access, skills, and outcomes; however, his work could be enhanced through greater attention to structural power dynamics. As digital systems increasingly mediate our social environment, research must examine how they not only reflect but actively produce societal inequality. Following Lutz's suggestions to expand digital inequality research, especially by incorporating critical algorithm studies and examining harms alongside benefits, can help develop a more comprehensive understanding of how technology both reproduces and potentially challenges social inequalities.