Is Your Assistive Tech Biased?
Five years ago I sat down, excited, at a table with a young Black student and her mother to show them everything her child could do with a new, robust augmentative and alternative communication (AAC) device.
She could tell us what she wanted to play with.
She could tell us her favorite color.
When one of her classmates was bothering her, she could tell them “stop.”
She loved it. The school loved it. Mom wasn’t sold.
“It doesn’t sound like her,” she objected.
Both of us knew this student’s mouth sounds were mostly squeals and cries. I opened the settings and showed her the choices: “Ella,” “Heather,” and “Tracy.” We listened to little clips of the computerized voices.
“They don’t sound like her.”
And she was right. There wasn’t a voice that sounded like someone from her family or community. Not a single voice that sounded like a young Black person, not on any system I could find. I could program a voice for her talker that sounded just like Yoda from Star Wars right then and there, but a Black American voice was apparently too far-fetched for assistive technology.
Because technology is programmed by people, and people all have biases, our assistive technology has biases. Those biases undermine the Universal Design for Learning (UDL) framework we use and, in some cases, are life-threatening.
The speech-to-text software doesn’t work equally well across all voices and varieties of English; it performs especially poorly on Black voices.
The grammar checker flags non-white varieties of English as errors.
The AAC system lacks language from other dialects, cultures, and communities, and when that vocabulary exists at all, it is labeled “fringe.” You want another language? It’s nominally available, but no one has downloaded the file or attempted a translation.
The visual support makers lack vocabulary that is developmentally appropriate for all school-aged children, such as words for sexual health, identity, and justice, or they lock it behind an “adult only” wall.
Indiana’s Article 7 special education law is explicit about how to determine whether a student can take their assistive technology (AT) home: “On a case-by-case basis, the use of school-purchased assistive technology devices in a student's home or in other settings *is required* if the student's CCC determines that the student needs access to those devices in order to receive a free appropriate public education” (my emphasis added).
If your staff refer to a “school policy” or a hoop for families to jump through, such as an after-school training, you’re inviting bias into determining which kids get to talk, read, and learn after the school bell rings at the end of the day.
Your word prediction program guesses that the words following “He is ___” are good, smart, and mean, but for “She is ___”: crazy, married, and pretty.
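The mechanism behind an example like that is simple: a word predictor suggests whatever followed a prompt most often in its training text, so biased text produces biased suggestions. A minimal sketch, using a hypothetical six-sentence corpus (not any real product’s data), makes the point:

```python
from collections import Counter

# A word predictor is only as unbiased as its training text.
# Hypothetical toy corpus standing in for real usage data:
corpus = [
    "he is good", "he is smart", "he is mean",
    "she is crazy", "she is married", "she is pretty",
]

def train(sentences):
    """Count which word follows each prompt in the training text."""
    counts = {}
    for s in sentences:
        *prompt, nxt = s.split()
        counts.setdefault(" ".join(prompt), Counter())[nxt] += 1
    return counts

def predict(counts, prompt, k=3):
    """Suggest the k continuations seen most often in training."""
    return [w for w, _ in counts.get(prompt, Counter()).most_common(k)]

model = train(corpus)
print(predict(model, "he is"))   # ['good', 'smart', 'mean']
print(predict(model, "she is"))  # ['crazy', 'married', 'pretty']
```

The predictions mirror the corpus exactly, which is also why the problem is fixable: change what the model learns from, and the suggestions change with it.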
As we scrutinize our own biases and the tools and instruction we welcome into our classrooms and families:
- Listen to the people using the technology.
- Question your own biases.
- Take action. Engage your colleagues in what you’ve learned. Dialogue with the people creating the technology. Good developers are open to constructive criticism from consumers. My word prediction example was immediately discussed and corrected by the company. If they aren’t responsive to your concern about bias within their product, why would you want that in your room?
Our assistive technology has problems created by humans. Humans can fix them.
Resources and Further Reading
Critical Practices for Anti-bias Education for K-12 Educators, Teaching Tolerance
Vocabulary for Socially Valued Adult Roles, Institute on Disabilities at Temple University
Ableism, National Conference for Community and Justice
Don’t Get It Twisted- Hear My Voice, ASHA Leader