Available for invited lectures, workshops, and policy dialogues on accessibility, design, and governance.
Within the digital corridors of this website, you will see that my professional journey has been dedicated to the pursuit of an inclusive world, primarily through the lens of physical infrastructure and legal governance. However, as I have noted elsewhere, beyond my work in accessibility audits, I draw on cinema and storytelling as spaces for reflection on empathy, transformation, and social change. It is this same fascination with the narratives that shape our lives that has led me to investigate a new and more insidious frontier of exclusion, and to establish a dedicated space for critical enquiry: The Bias Pipeline. A personal undertaking that fuses my private passions with my enduring commitment to social reform, The Bias Pipeline (biaspipeline.nileshsingit.org) is dedicated to unmasking the intersection of disability rights, artificial intelligence, and the burgeoning field of technoableism.
The Bias Pipeline is not simply a repository of technical data; it is an analytical framework designed to scrutinise how automated systems and digital infrastructures can reproduce, and often amplify, historical prejudices. For many years, technology has been marketed as a universal panacea for the challenges faced by the disabled community. This ideology, often referred to as technoableism, frequently suggests that the solution to disability lies in fixing the individual through technological intervention, rather than addressing the systemic barriers inherent in our social and digital structures. Through this project, I seek to challenge these narratives and redirect the focus toward a rights-based approach to governance.
When you explore The Bias Pipeline, you will find a comprehensive examination of how bias is introduced at every stage of the algorithmic lifecycle. From the initial collection of data, which often excludes the lived experiences of disabled people, to the deployment of predictive models in healthcare, employment, and surveillance, the pipeline is fraught with risks. I examine how these systems can inadvertently penalise those whose bodies or minds do not conform to the normative data points favoured by developers. My work here is informed by my ongoing research and advocacy, where I continue to argue for safeguards that ensure AI serves as a tool for inclusion rather than a mechanism for further marginalisation.
I am immensely passionate about this aspect of our digital future, and I invite you to explore these complexities with me. Whether you are a researcher, a policy maker, or someone concerned about the future of human rights in the age of automation, your perspective is vital. The Bias Pipeline is a call to action for all of us to look beyond the technical novelty of AI and instead focus on the human impact. I encourage you to visit biaspipeline.nileshsingit.org and join me in this essential journey of discovery and reform. Together, we can work toward dismantling these technoableist systems and ensuring that the future of technology is truly inclusive for everyone.