Approaching Justice and Ethics in my Artistic and Programming Practice
As a student of web and game development, I've learned that every design choice I make and every line of code I write has ethical consequences. The digital artifacts we produce influence behavior, shape experiences, and can reinforce or upend established power structures. Drawing on design justice concepts and ethical UX frameworks, this reflection examines how I approach justice and ethics in my technical and creative work.
Understanding the Landscape: Where Ethics Meet Code
According to Ghanchi (2021), "The content that designers create represents our social thought, values, and culture. Similarly, any product's design embodies a value system, clearly indicating the designer's beliefs and moral principles." This reality has forced me to move beyond viewing my projects as neutral technical exercises and instead to recognize them as inherently political acts that shape social realities.
Costanza-Chock's (2020) concept of design justice fundamentally challenges the traditional notion of the designer as an isolated expert. The principle that "we prioritize design's impact on the community over the intentions of the designer" has become central to how I approach my work. This means moving beyond the "build it and they will come" mentality that Sambuli (2019) critiques, where technological solutions are assumed to automatically benefit everyone equally.
This principle has been particularly challenging for me as an artist and programmer who has often focused heavily on personal vision and technical elegance. It requires me to shift from asking "What do I want to create?" to "What does the community need, and how can my skills serve that need?"
In my programming practice, I've translated this to questioning not just how something works, but who it works for and who it might exclude. Every user interface, game mechanic, or web system embeds a value system, whether intentionally or not.
Recognizing and Addressing Dark Patterns and Bias
The prevalence of dark patterns in digital design serves as a cautionary tale for anyone working with technology. Mathur et al. (2019) surveyed roughly 11,000 shopping websites and found dark patterns on more than a thousand of them, revealing how widespread manipulative design has become. These patterns, from hidden costs to forced continuity, represent the ethical failures that occur when commercial interests override user wellbeing.
As an artist and programming student, I must actively guard against incorporating such patterns into my work, whether in interactive systems, games, or web-based projects. This requires developing literacy around deceptive design practices and committing to transparency in my user interactions.
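As a small counter-example to the "hidden costs" pattern mentioned above, the sketch below itemizes every charge up front so the displayed total is the real total. It is a minimal illustration, not a production checkout; the fee names and amounts are hypothetical, and prices are kept in integer cents to avoid floating-point surprises.

```python
def itemized_total(base_cents, fees):
    """Return all line items together with the full total, so no fee
    appears for the first time at the final checkout step."""
    lines = [("item", base_cents)] + list(fees.items())
    return lines, sum(amount for _, amount in lines)

# Hypothetical purchase: the player sees every component of the price at once.
lines, total = itemized_total(1999, {"tax": 160, "platform fee": 40})
print(total)  # 2199 (cents), the price the player actually pays
```

Keeping the itemization and the total in one return value makes it harder for an interface to show the total while quietly omitting the breakdown.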
Equally important for me is addressing unintentional bias in my work. Bias can emerge through dataset selection, algorithmic design choices, interface assumptions, and countless other technical decisions. The key is building reflexive practices that help me recognize my own blind spots and assumptions before they become embedded in my creations.
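One reflexive practice along these lines is to audit a dataset's composition before building on it. The sketch below, with a hypothetical playtest sample and a made-up `age_band` attribute, simply reports how often each value appears; a lopsided distribution is not proof of bias, but it is a prompt to ask who is missing and why.

```python
from collections import Counter

def representation_report(records, attribute):
    """Summarize how often each value of `attribute` appears in a dataset.

    Records missing the attribute are counted as "unrecorded" rather
    than silently dropped, since absence is itself informative.
    """
    counts = Counter(r.get(attribute, "unrecorded") for r in records)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

# Hypothetical playtest sample: heavily skewed toward one age band,
# so conclusions drawn from it will not generalize.
playtesters = [
    {"age_band": "18-24"}, {"age_band": "18-24"},
    {"age_band": "18-24"}, {"age_band": "25-34"},
]
print(representation_report(playtesters, "age_band"))
# {'18-24': 0.75, '25-34': 0.25}
```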
Centering Community Voices in Development
The Design Justice Network's call to center "the voices of those who are directly impacted by the consequences of the design process" has completely changed the way I conduct user research and testing. I've started integrating community input from the very beginning of development rather than presuming I already understand users' needs.
For game development, this means considering diverse player experiences from the concept phase. Who are my characters? What stories am I telling? What assumptions about "normal" gameplay am I making? Costanza-Chock (2020) shows through the airport security example how binary assumptions get encoded into systems—from databases to user interfaces to algorithms. In games, similar binary thinking often appears in character creation systems, narrative structures, and gameplay mechanics that assume particular cultural contexts as universal.
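To make the binary-assumptions point concrete, here is a minimal sketch of a character data model that does not hard-code a gender binary. The field names and defaults are my own hypothetical choices: pronouns are free-form, player-editable text rather than a forced Male/Female enum, so the schema itself does not exclude players the designer did not anticipate.

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    """A character model that avoids encoding a gender binary.

    Identity is open-ended text supplied by the player, not a fixed
    enum, because whatever options a designer enumerates will leave
    someone out.
    """
    name: str
    pronouns: str = "they/them"  # editable default, not a binary choice
    identity_tags: list = field(default_factory=list)  # optional self-description

hero = Character(name="Ash")
print(hero.pronouns)  # "they/them" unless the player chooses otherwise
```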
Decolonizing Development Practices
The decolonizing design movement challenges me to question not just what I'm building, but how I'm building it. Rather than simply adding diversity to existing frameworks, decolonizing practice requires "epistemic delinking"—"decoupling oneself from a pure reliance on the Western canon and from Western design frameworks, methods, techniques, and practices" (Geyser, 2025).
The call to "delink from the present world-system" and develop "plural design practices" means actively seeking out and incorporating non-Western approaches to technology and storytelling. In practice, this involves:
- Diverse research sources: Moving beyond Anglo-European design frameworks to include indigenous and non-Western approaches to human-computer interaction
- Collaborative creation: Working with communities rather than for them, especially when developing content that touches on cultural themes
- Alternative success metrics: Measuring impact beyond traditional metrics like engagement time or revenue
Implementing Ethical Principles in Technical Work
Drawing from the principles outlined in the ethics readings, I've developed a framework for ethical development that includes:
Accessibility as Foundation
Following Vaidya's (2020) accessibility guidelines, I treat accessibility not as an afterthought but as a core design constraint. This means designing for screen readers, considering cognitive load, and ensuring content works across different devices and connection speeds.
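One way to treat accessibility as a design constraint rather than an afterthought is to automate the checks that can be automated. The sketch below uses Python's standard-library HTML parser to flag `<img>` tags with no `alt` attribute. This catches only a small slice of accessibility; it complements, and never replaces, testing with real assistive technology and real users.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> tags that have no alt attribute at all.

    Note: alt="" is a deliberate choice (decorative image) and is
    not flagged; only a completely absent alt attribute is.
    """
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing.append(self.getpos())  # (line, column) of the offending tag

checker = MissingAltChecker()
checker.feed('<img src="map.png"><img src="key.png" alt="legend for the map">')
print(len(checker.missing))  # 1: the first image has no text alternative
```

A check like this can run in a build pipeline so regressions are caught before they ship.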
Privacy by Design
Rather than asking users to trust me with their data, I design systems that minimize data collection and maximize user control. This aligns with Sambuli's (2019) emphasis on addressing "inequalities of access" by not requiring users to sacrifice privacy for functionality.
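Data minimization can be enforced in code rather than left to policy. A minimal sketch, assuming a hypothetical cloud-save feature that genuinely needs only a username and a save slot: everything outside the allow-list is discarded before it reaches persistent storage.

```python
# Hypothetical allow-list: the only fields this feature actually needs.
REQUIRED_FIELDS = {"username", "save_slot"}

def minimize(profile: dict) -> dict:
    """Return only the allow-listed fields; everything else is dropped
    before storage, so it can never leak, because it was never kept."""
    return {k: v for k, v in profile.items() if k in REQUIRED_FIELDS}

raw = {"username": "kit", "save_slot": 3, "email": "kit@example.com", "location": "Oslo"}
print(minimize(raw))  # {'username': 'kit', 'save_slot': 3}
```

An allow-list is deliberately chosen over a block-list here: new fields added upstream are excluded by default instead of collected by accident.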
Transparency in Algorithmic Decision-Making
When implementing any automated systems—whether recommendation algorithms in apps or AI behaviors in games—I try to document how these systems work and what data they use. Users should understand when and how algorithms are making decisions that affect their experience.
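One concrete form this documentation can take is returning an explanation alongside each result, so the interface can show users why something was suggested. The sketch below is a hypothetical tag-overlap recommender, far simpler than a real system, but it illustrates keeping the score and the reason together.

```python
def recommend(player_tags, catalog):
    """Rank catalog items by tag overlap with the player's recent play,
    attaching a human-readable reason to every suggestion."""
    results = []
    for item in catalog:
        shared = sorted(set(item["tags"]) & set(player_tags))
        if shared:
            results.append({
                "title": item["title"],
                "score": len(shared),
                "because": f"shares tags {shared} with your recent play",
            })
    return sorted(results, key=lambda r: r["score"], reverse=True)

# Hypothetical catalog and player history.
catalog = [
    {"title": "Star Drift", "tags": ["space", "puzzle"]},
    {"title": "Mire", "tags": ["horror"]},
]
for rec in recommend(["puzzle", "space"], catalog):
    print(rec["title"], "-", rec["because"])
# Star Drift - shares tags ['puzzle', 'space'] with your recent play
```

Because the reason is part of the data structure, a UI cannot show the recommendation while omitting the explanation by accident.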
Community Engagement and Participatory Design
Before beginning any project that will affect others, I now try to engage with the relevant communities early and meaningfully. This means more than conducting surveys or focus groups; it means building ongoing relationships and collaborative processes. As the Design Justice principles suggest, "We see the role of the designer as a facilitator rather than an expert" and "We believe that everyone is an expert based on their own lived experience."
Critical Technical Literacy
I'm working to develop deeper understanding of how my technical choices affect social outcomes. This means learning about topics like algorithmic bias, data privacy, accessibility standards, and the environmental impact of digital technologies.
Challenges and Ongoing Learning
Scale is one of the most difficult issues in ethical practice in the digital age. Unlike works in conventional artistic media, digital creations can reach enormous audiences almost instantly. The decolonizing design literature reminds us that "there are no shortcuts, no quick byways through the modern world-system" for this work.
One particular challenge I face is balancing commercial viability with ethical principles. The gaming industry's reliance on engagement metrics and monetization can conflict with user-centered design. However, I've found that ethical design often leads to more sustainable and meaningful user relationships.
Moving Forward: Critical Code Studies
As suggested in the readings, developing a critical understanding of code means recognizing that programming is never politically neutral. Every technical decision—from database structures to user interface layouts—carries political implications. The goal is "to understand (the code you are writing and what it actually does) and articulate (how it fits into the broader socio-cultural framework) is to challenge and resist [and] create (meaningful technologies) and emancipate."
This means continuously educating myself about the social impacts of technology, staying engaged with communities affected by my work, developing reflexive practices that help me recognize my own biases and assumptions, being willing to have uncomfortable conversations about power, privilege, and responsibility, and remaining open to criticism and course correction.
Conclusion
Approaching justice and ethics in artistic and programming practice requires moving beyond individual good intentions toward systemic thinking about power, access, and representation. It means recognizing that technology is never neutral and that developers have a responsibility to consider the broader implications of their work.
The path forward involves centering community voices, actively working against exclusionary design, and remaining committed to ongoing learning and accountability. As Costanza-Chock (2020) reminds us, "we view change as emergent from an accountable, accessible, and collaborative process, rather than as a point at the end of a process."
As someone working at the intersection of art and technology, I have both tremendous opportunity and significant responsibility. My work can perpetuate existing inequalities or actively challenge them. The choice is mine, but it's not one I can avoid making. In the words adapted from the decolonizing design movement: "to understand and articulate is to challenge and resist [and] create meaningful technologies and emancipate."
This is not a destination but an ongoing practice—one that requires humility, critical self-reflection, and a commitment to justice that goes beyond individual projects to encompass our entire approach to creating digital experiences. The question isn't whether I'll have an impact; it's whether that impact will contribute to justice or work against it.