AI Evolving within the Educational Landscape
AI: Friend or Foe in the World of Education
The Great Educational Turning Point
Across classrooms around the world, a quiet revolution is occurring. Artificial intelligence (AI) is gradually integrating into the fabric of teaching and learning: sometimes subtly, through tools like adaptive quizzes, and other times more boldly, through fully AI-driven tutoring systems. Whether we realise it or not, education is on the brink of a digital transformation that could redefine how we learn, teach, and connect with one another.
For neurodiverse learners, those who think, learn, and interact differently due to variations such as autism, ADHD, dyslexia, or dyspraxia, this revolution presents both opportunities and challenges. AI-powered tools are often praised for their ability to personalise learning experiences, but the key question remains: can they genuinely honour the diversity of the human mind? The real challenge lies not in whether AI should be part of education, but in how we can shape it to support every learner while preserving the essential elements of teaching: empathy, creativity, and connection.
The Promise of AI: Empowering Neurodiverse Learners
AI’s greatest strength lies in its ability to personalise learning. For decades, educators have sought to tailor lessons to meet students' diverse needs, but time, resources, and class-size constraints have hindered this goal. However, AI is starting to bridge that gap. With adaptive learning platforms, real-time analytics, and speech-recognition technology, teachers can now gain valuable insights into how each learner absorbs and applies knowledge.
For neurodiverse learners, personalisation can be life-changing. Students with dyslexia can benefit from text-to-speech and predictive spelling tools, which reduce cognitive load and allow them to focus on understanding rather than decoding words. Learners with ADHD can utilise AI-driven focus assistants that break tasks into manageable steps, help track progress, and provide gentle reminders to stay on task. Additionally, emotion-aware tutoring systems can interpret facial expressions and adjust feedback to prevent feelings of frustration or withdrawal.
Research supports this promise of inclusivity in education. Al-Hendawi (2025) found that AI-enhanced teaching strategies lead to improved engagement and autonomy for students with special educational needs. Virtual tools like VRAILEXIA (Tzafilkou & Perifanou, 2024) integrate AI and virtual reality to help dyslexic students through immersive reading environments that adapt to their sensory profiles. AI thus becomes not just a technology but an ally for accessibility, opening new avenues for learning for students who have often been misunderstood by traditional, one-size-fits-all schooling.
Teachers as Designers of Human-Centred AI
AI can process data, but it cannot love, inspire, or understand human potential. This is where teachers remain irreplaceable. As AI becomes more integrated into classrooms, educators must assume the role of ethical designers, shaping how technology interacts with students. The role of a teacher is evolving from merely transmitting knowledge to creating learning ecosystems where AI acts as a support system rather than a replacement.
In this changing environment, it is essential to establish teacher-friendly structures. Educators need clear frameworks to select, evaluate, and adapt AI tools for inclusive practices. One such foundation is the set of Universal Design for Learning (UDL) principles developed by CAST (2018). UDL emphasises multiple means of engagement, representation, and expression, which aligns seamlessly with the inclusive potential of AI. By adopting a human-centred AI approach informed by UDL, we can ensure that neurodiverse learners are not just accommodated but empowered.
Consider Ms Rina, a middle school science teacher in Phnom Penh. She uses an AI writing assistant to help her neurodiverse students craft their research reflections. The AI provides structural support and vocabulary assistance, while Rina adds empathy, feedback, and cultural context. In her classroom, technology enhances rather than replaces her humanity.
When Algorithms Misunderstand Difference
The same systems that empower individuals can also lead to exclusion. AI decisions are only as unbiased as the data they learn from. If algorithms are trained on limited views of "typical learning," they may pathologise differences. For instance, a neurodiverse child who avoids eye contact might be labelled as disengaged, while a student who needs to move to focus may be seen as off-task.
These algorithmic blind spots underscore the urgent need for ethical guidelines. Armstrong (2017) emphasises that neurodiversity should not be viewed as a deficit, but rather as a valuable form of human variation. When AI fails to recognise this, it can unintentionally reinforce stigma. Additionally, privacy risks are significant; systems that gather behavioural data on neurodiverse learners may inadvertently expose sensitive information about cognitive profiles or diagnoses.
UNESCO (2023) emphasises that schools should view AI as a tool for inclusion rather than a means of surveillance. The future of education relies on our capacity to critically examine AI to ensure that its principles align with human rights, dignity, and diversity.
Stories from the Classroom: Small Shifts, Big Impact
The most profound evidence of AI’s impact often comes not from laboratories, but from classrooms. Take Leo, a 10-year-old student with ADHD. Writing assignments once brought tears and frustration. With an AI planning app, he now breaks essays into colour-coded chunks and uses voice-to-text to capture ideas before they vanish. His teacher reports newfound confidence and creativity, not because the AI taught him differently, but because it gave him space to think differently.
Or take Maya, a dyslexic student in secondary school, who uses an AI summarisation tool to distil complex texts. Instead of feeling overwhelmed, she can now discuss themes with peers and engage critically. For her, AI acts as a literacy bridge, not a crutch.
Such stories reveal a critical insight: inclusion does not mean standardisation. It means designing systems that honour the rhythm, pace, and style of every learner’s mind.
The Human Equation: Empathy in the Age of AI
Inclusive education is grounded in empathy: the teacher’s ability to notice the struggles and curiosity that technology cannot fully detect. While AI can analyse patterns, it cannot grasp purpose; it can predict behaviour, but it cannot recognise potential.
The challenge, therefore, is finding a balance. As AI tools become more advanced, educators must be more intentional in their approach. The goal is to build AI-empowered communities that use technology to enrich learning, not replace human experiences.
Empathy also extends to teacher training. Professional development should not focus solely on technical skills but should also include ethical awareness: the ability to question AI decisions, interpret outputs, and involve students, especially those who are neurodiverse, in co-designing solutions. In this way, AI literacy serves as an inclusive pedagogy.
The Way Forward: Building Inclusive AI Literacy
To ensure AI remains a friend, education systems must prioritise AI literacy for inclusion. This literacy involves three interlocking dimensions:
1. Critical understanding – Teachers and students must grasp how AI systems make decisions and where biases arise.
2. Creative application – Educators should experiment with AI to enhance flexibility, not conformity, in learning design.
3. Ethical reflection – Schools must uphold transparency, privacy, and dignity, ensuring that no learner’s data is treated as a commodity.
Governments and teacher education institutions can play a transformative role by embedding inclusive AI ethics and neurodiversity awareness into curriculum frameworks. Educators, technologists, and neurodiverse advocates can collaborate on AI design that incorporates student input.
As CAST (2018) advocates, universal design is not just an educational method but a moral stance: a belief that learning should adapt to people, not the other way around.
Conclusion: The Heart of the Matter
Is AI a friend or foe in education? The answer depends on how we use it. AI is neither a saviour nor a villain; it reflects the intentions of those who control it. When guided by empathy, ethics, and inclusion, AI can serve as a bridge to opportunity, especially for neurodiverse learners whose talents may not be measured by traditional standards.
However, if used carelessly, AI can deepen existing divides, turning data into labels and students into profiles. The challenge ahead is not technological; it is human. We must ensure that every algorithm in the classroom serves a greater purpose: to cultivate curiosity, compassion, and connection. The future of education will not be shaped by code alone, but by the courage of teachers and students to envision a more inclusive world.
References
Al-Hendawi, M. (2025). Artificial intelligence applications in special education: A systematic literature review (2019–2024). Social Sciences, 14(5), 288. https://doi.org/10.3390/socsci14050288
Armstrong, T. (2017). The power of neurodiversity: Unleashing the advantages of your differently wired brain. Da Capo Lifelong Books.
CAST. (2018). Universal Design for Learning guidelines version 2.2. CAST. https://udlguidelines.cast.org
Tzafilkou, K., & Perifanou, M. (2024). VRAIlexia: A VR and AI system to support dyslexic higher education students. arXiv preprint. https://arxiv.org/abs/2402.01668
UNESCO. (2023). Guidance for generative AI in education and research. UNESCO. https://unesdoc.unesco.org