Call for Papers: “Disability, Disability Studies, and Artificial Intelligence”

Manuscript Deadline: January 6, 2025

The Journal of Teaching Disability Studies (JTDS) invites papers related to the theme “Disability, Disability Studies, and AI” for its fourth issue.

Background: Disability and AI

The meaningful involvement of disabled people as collaborative partners in research, design, and development has long been urged as the desired corrective for technoableism, Ashley Shew’s (2023) term for “a belief in the power of technology that considers the elimination of disability a good thing, something we should strive for” (p. 8), and for what Bonnie Tucker (2017) has called technocapitalist disability rhetoric, “rhetoric in popular media and tech research that attributes agency to technology and tech companies and simultaneously revokes it from disabled people” (para. 2). Recent developments, however, suggest that truly collaborative and participatory practices are difficult to achieve in the realm of AI.

Many AI systems collect and adapt to user data, leaving their users increasingly vulnerable to risk. In a Center for Democracy & Technology report on disability bias and discrimination in algorithmic technologies, Lydia X. Z. Brown and colleagues noted that “algorithmic technologies that are trained on data that already embeds ableist (or relatedly racist or sexist) outcomes will entrench and replicate the same ableist (and racial or gendered) bias in the computer system.” Yet the risks remain exceedingly difficult to appraise. Because many AI systems are created and maintained for profit, they are often proprietary and closed, making them difficult—but not impossible—to comprehend, critique, and change.

In their report, Disability, Bias, and AI, prominent disability and technology scholars and practitioners working across industry and academia concluded: “Disabled people, along with other affected communities, must be at the center of any approach, defining the terms of engagement, the priorities of the debate, and retelling the story of AI from the perspective of those who fall outside of its version of ‘normal.’”

Teaching Disability Studies and AI

Within this broad context rest questions relevant to teaching and learning disability studies.

The ready availability of AI technologies and the speed at which they are being adopted in daily life underscore the urgent need for critical attention. While these technologies can serve as collaborative partners, they also pose underexplored risks. Further, their complicated entanglements with prevailing academic and institutional discourses of “fairness” and “rigor”—discourses that often depend on and help to drive academic ableism—merit our sustained critical attention.

Critical disability studies scholars and educators have much to contribute to current conversations, including conversations about bias in AI as well as about its potential to support more inclusive teaching and learning and to challenge widely accepted distinctions between human learning and machine learning, or human intelligence and artificial intelligence.

By recognizing the interdependence and inseparability of bodyminds, objects, spaces, and systems, critical disability studies perspectives can create meaningful and impactful opportunities for anti-ableist intervention, making space for disabled people in the research, design, and development of these technologies.

To these ends, we invite manuscripts that broadly address:

  1. How might AI tools participate in disability studies teaching and learning?
  2. How do these tools align with the field’s social and political objectives? What tensions, opportunities, and questions arise when they meet?
  3. How can disability studies students and educators, and our allies and community partners, contribute to the emerging global discourse surrounding these technologies and AI bias more broadly?

We seek manuscripts that take a critical and ambivalent stance regarding technology and access. While we recognize the value and importance of technology-mediated access, we invite prospective authors to attend to the “two frictional meanings of access” outlined by Aimi Hamraie and Kelly Fritsch (2019). Within mainstream rights-based approaches, access has often meant “disabled inclusion and assimilation into normative able-bodied relations and built environments” (p. 10). But placed within a critical and justice-oriented framework, access serves to challenge and disrupt normativity, centering “disabled peoples’ acts of non-compliance and protest” (p. 10). We encourage authors to consider the questions raised by J. Logan Smilges (2023): “What do we want access to, and why do we want access to it?” (p. 5). We are also interested in how disability studies educators and students who use, discuss, and critique AI grapple with and enact these questions in the context of neoliberal-ableist institutions.

Process and Tentative Timeline

Please see the Author Guidelines page of the JTDS website.

  • January 6, 2025: Submission deadline
  • May 2025: Authors notified
  • November 2025: Publication

References

AI Now Institute. (2019). Disability, bias, and AI.

Center for Democracy & Technology. (2022). Ableism and disability discrimination in new surveillance technologies: How new surveillance technologies in education, policing, health care, and the workplace disproportionately harm disabled people.

Hamraie, A., & Fritsch, K. (2019). Crip technoscience manifesto. Catalyst: Feminism, Theory, Technoscience, 5(1), 1–34.

Shew, A. (2023). Against technoableism: Rethinking who needs improvement. Norton.

Smilges, J. L. (2023). Crip negativity. University of Minnesota Press.

Tucker, B. (2017). Technocapitalist disability rhetoric: When technology is confused with social justice. Enculturation: A Journal of Rhetoric, Writing, and Culture.
