News Release

Professors call for further study of AI's potential uses in special education, not outright bans

Technology could greatly help students with disabilities in writing; policy, research should consider upsides

Peer-Reviewed Publication

University of Kansas

LAWRENCE — Artificial intelligence is making headlines about its potentially disruptive influence in many spaces, including the classroom. A group of educators that includes a University of Kansas researcher has just published a position paper reviewing AI’s potential in special education, calling for patience and consideration of its potential uses before such technology is banned.

Most importantly, AI should be considered a tool that can potentially benefit students with disabilities, according to James Basham, KU professor of special education, and co-authors. Tools such as ChatGPT can quickly produce writing. And naturally, some students have used that to avoid schoolwork.

But banning it is not the answer.

“It’s really been over the last decade or so that we’ve seen AI and machine learning move from just what you might call geek culture to the bigger world,” Basham said. “We’ve been studying it, but ChatGPT made it a little more real by making it available to the public. While we think the writing process is complex, AI can do it, quickly and fairly well.

"When you think about people with disabilities in education, you often think about writing. We get referrals all the time for students who can’t or struggle to express themselves in writing. And AI can help with that. So we need to think about what questions we need to ask or issues to think about.”

In the paper, the authors provided a brief history of artificial intelligence and how it developed to its current state. They then considered ethical questions regarding its use in education and special education and how policy should address the technology’s use. Foremost, schools should not reflexively ban the technology, the authors wrote. Meanwhile, educators, researchers and others need to think about what they want students to learn and how the technology can aid that process. Additionally, teacher educators who are producing future generations of educators need to work with their students to consider how they can effectively address the topic.

Among the main ethical considerations is information literacy, the authors wrote. Students need to learn how and where to find valid information as well as how to discern true information from false, think critically and assess topics to avoid misinformation. Educators should also avoid the trap of evaluating skills like writing too narrowly.

“If we’re only having students do things in one certain way, the AI can probably do that,” Basham said. “But if we’re bringing in multiple concepts and modalities, then it’s a much different conversation. We need to think about who we are as a society and what we teach, especially when we think about students with disabilities, because they are often judged on just one aspect.”

The article, published in the Journal of Special Education Technology, was co-written with Matthew Marino, Eleazar Vasquez and Lisa Dieker, all of the University of Central Florida, and Jose Blackorby of WestEd.

The authors also urged those in education to consider whether AI is a “cognitive prosthesis” or something more. Just as a student with physical impairments might use speech-to-text to translate their thoughts to writing more efficiently, or a student with a hearing impairment can use a phone app to filter ambient noise in the classroom, a student with cognitive disabilities could potentially use AI to improve their writing.

But while technology can help students improve writing and other skills, educators need to consider consent, the authors wrote. All students should be taught what information any AI collects, how it is stored and how it is shared. Parents also have a role to play, the authors wrote, in considering whether a school that uses AI is right for their child, whether its use complies with an Individualized Education Program and whether it can be personalized while respecting diverse student backgrounds and values.

The authors also noted that AI already exists in schools: Students use laptops, tablets, smartphones and other technologies unavailable to previous generations. Yet those tools are not banned from classrooms outright. Similarly, while technologies such as ChatGPT could be used to cheat or reduce student workload, they could also potentially be an effective resource for students with disabilities. Before any such judgments are made, researchers and policymakers should continue to ask questions and ensure people who represent students with disabilities are at the table, the authors wrote.

“Technology is a societal experiment,” Basham said. “We can use it effectively or ineffectively. But the education system needs to get in front of it and figure out how to use this particular technology to further human betterment. What we need is not to be afraid of change but to think about critical thinking and problem-solving so we are teaching students to do that, whether with AI or without. We need to reflect not just on how it will change our lives today, but on what it means for the future.”
