According to the Pew Research Center, a quarter of U.S. teachers say that AI tools do more harm than good in K-12 education. But is that actually true, or is this perception merely the result of not finding the right solutions to the challenges of using AI in education?
It’s a question we at CHI Software think about all the time, since we specialize in developing custom EdTech solutions in the AI domain and beyond. Here’s why: when businesses and educational institutions view AI with skepticism, they lose on its benefits and opportunities. Although the challenges are real and complex, they are solvable — that’s the conclusion we are leaning into after we collected opinions from the experts, both internal and external.
In this article we share what we’ve learned and offer our experts’ takes on AI performance, security, cost, adoption, ethics, and ROI.
AI challenges look different in real systems. See how they play out in practice.
According to Carnegie Learning, 72% of educators say they feel concerned about AI safety, with a further 22% saying they are “apprehensive” about the risks. At the same time, ML providers stress that they do not access any personal or business data for model training.
Current research estimates that implementing AI-powered tools across 26,500 schools in the UK could cost GBP 0.4 billion for initial setup and around GBP 1.2 billion per year.
Programs like the FIPSE Special Projects Program by the U.S. Department of Education could potentially finance AI support in education.
Only about 25% of schools train their staff on how to use ML tools, leading to low AI adoption and concerns among teachers.
Here’s a look at the main AI challenges in education and the ways to address them. Keep reading for a closer breakdown.
Challenge 1: Data Quality & Bias
“What are the two things plants need to make their own food?” — that’s the kind of question AI may generate for a fifth-grade biology assessment if it hasn’t been trained on quality data. The phrasing can confuse even adults, let alone children, because photosynthesis requires more than two things: water, sunlight, and carbon dioxide, plus proper soil conditions and temperature. That’s exactly why data quality remains one of the top AI challenges in education: without clear examples of good and bad outputs, the model simply guesses — and it’s teachers who know those examples best.
Expert Solutions: Create Gold-Standard Examples for AI
Solving the AI training puzzle comes down to a few essential pieces. Each one follows the same logic you would apply when teaching a human: to explain something, you need to have a “curriculum”, good and bad examples, and practice.
Ivan Kuzlo
Engineering Director
In our work, human-like training logic translates into a clear AI training sequence:
We start by cleaning and structuring raw materials such as course outlines and students’ responses. These materials form the foundation of our “AI curriculum.”
Once we’ve prepared the datasets, we create “gold-standard” examples that show an AI tool what good and bad outputs look like.
From there, we bring in educators to refine those samples. Teachers highlight the real classroom standards that AI outputs need to meet.
Based on educators’ refinements, we fine-tune the model and reduce hallucinations.
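Under the hood, this pipeline can start very simply. Here is a minimal sketch of the gold-standard step, assuming a made-up record format where educators mark which examples are approved — all field names and sample questions are illustrative, not our actual training data:

```python
import json

# Hypothetical gold-standard records: each pairs a prompt with a good
# (teacher-approved) output and a bad (rejected) output for contrast.
raw_examples = [
    {
        "prompt": "Write a 5th-grade biology question about photosynthesis.",
        "good": "Name three things a plant needs for photosynthesis.",
        "bad": "What are the two things plants need to make their own food?",
        "approved_by_educator": True,
    },
    {
        "prompt": "Write a 5th-grade biology question about plant cells.",
        "good": "Which part of a plant cell captures sunlight?",
        "bad": "Explain chloroplast thylakoid membranes in detail.",
        "approved_by_educator": False,  # still awaiting teacher review
    },
]

def build_finetuning_set(examples):
    """Keep only educator-approved pairs and emit JSONL lines for fine-tuning."""
    lines = []
    for ex in examples:
        if not ex["approved_by_educator"]:
            continue  # teachers must refine and approve samples first
        lines.append(json.dumps({"prompt": ex["prompt"], "completion": ex["good"]}))
        # Rejected outputs can feed a separate contrast/preference dataset.
    return lines

dataset = build_finetuning_set(raw_examples)
print(len(dataset))  # → 1: only the approved example survives
```

The point of the gate on educator approval is exactly step 3 above: no sample reaches fine-tuning until a teacher has signed off on it.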
All in all, data quality challenges of AI integration in education boil down to these four steps. In practice, though, they may involve more nuanced technical complications.
Challenge Solved. Real-Life Case of Data Preparation
Our client needed to add an AI feature to generate accurate, curriculum-aligned assessment questions. The hurdle was in the volumes of AI training materials: CHI Software’s developers had to account for data quality, diversity, curriculum alignment, and fairness across varying student skill levels.
Data preparation is one of the biggest issues of AI in education — our engineers tackle it through repeated refinement cycles and well-defined quality rules.
To minimize the possibility of inaccurate AI outputs in this project, our engineers ran multiple refinement cycles and set clear quality rules, from difficulty levels to question structures. The results were promising: a well-trained AI model reduced teachers’ manual workload by 50%, and a follow-up user survey on the diversity of AI-generated assessment questions found that students’ learning experience had improved by 30%.
Challenge 2: Privacy & Security
Once an AI feature is up and running, the next challenge appears immediately: making it safe for your platform and users. 72% of educators say that they are concerned, according to Carnegie Learning, with 22% highly worried about AI security. Do you know how safe AI is for your students?
Educators have good reason to be concerned: legal issues with AI in education may include data breaches of students’ emails, school information, records on students with learning disabilities, face scans, and voice recordings. Unauthorized access to such data could amount to biometric identity theft or damage to a student’s academic reputation.
Expert Solution: Blend Industry Knowledge With Technical Skills
Yana’s take aligns with the insights from the UNESCO report on AI dilemmas: “AI futures are not preordained. They are built through policy choices, funding priorities, and evaluation standards.”
Challenge Solved. Case With A Secure Automated Grading Platform
Our client needed an automated grading feature, which requires data on students’ IDs and performance history. But any data that can reveal a student’s persona and academic records poses security risks of AI in education and falls under regulatory laws like GDPR.
The security challenges of AI in education can cause serious issues to businesses — our engineers address these risks by using encryption and anonymized logs.
To comply with GDPR standards in the project, our team applied encryption and anonymized logs. In this setup, the AI model is able to process the data, but not access it in plain text and not share it with any open-internet sources. Because the data had been encrypted, even our engineers could not see student identities; they could only work with anonymized datasets.
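A minimal sketch of this idea, using only Python’s standard library: raw student IDs are replaced with keyed hashes before any record reaches the model, so both engineers and the AI work with pseudonyms. The key handling and field names here are illustrative assumptions, not the client’s actual setup:

```python
import hashlib
import hmac

# Illustrative key; in practice it would come from a secrets vault, never source code.
SECRET_KEY = b"rotate-me-outside-source-control"

def pseudonymize(student_id: str) -> str:
    """Replace a raw student ID with a keyed hash so it never appears in plain text."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_for_grading(record: dict) -> dict:
    """Strip direct identifiers and keep only what the grading model needs."""
    return {
        "student_ref": pseudonymize(record["student_id"]),
        "answer_text": record["answer_text"],
        # name, email, and other PII are deliberately dropped here
    }

record = {
    "student_id": "S-1042",
    "name": "Jane Doe",
    "email": "jane@example.edu",
    "answer_text": "Plants use sunlight...",
}
safe = prepare_for_grading(record)
print("email" in safe)  # → False: PII never reaches the model
```

Because the hash is keyed and one-way, the same student always maps to the same `student_ref` (so performance history still links up), while nobody downstream can recover the identity without the key.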
Challenge 3: Sufficient Infrastructure & Cost
The next challenge goes hand in hand with security obstacles: most AI tools need a stable internet connection, modern devices, cloud services, and ongoing support to run safely and efficiently. That’s where the financial AI issues in education stem from: not all school districts and EdTech companies can provide the required tech stack or invest in it.
The numbers behind AI outlays are indeed significant: the Tony Blair Institute for Global Change estimated that AI-powered tools could cost GBP 0.4 billion for initial setup and around GBP 1.2 billion per year for 26,500 schools in the UK.
Expert Solutions: Opt for Low-Cost Infrastructure
Although AI integrations do require some up-front investment, they can be manageable even on low budgets. Recent research published in the London Journal of Social Science points to opportunities to make AI more accessible:
Schools and small businesses can apply for grants. Governments and international organizations regularly fund projects in line with the UN’s Sustainable Development Goals. You can apply for such programs since AI tools align with the priority of educational development. For instance, the U.S. Department of Education manages the FIPSE Special Projects Program, providing grants to higher education institutions exploring AI applications in education.
Institutions can resolve connectivity issues with renewable energy sources. With or without AI, governments regularly invest in national electrification programs. Meanwhile, small solar energy installations may be enough to close internet connection gaps at some schools.
You can opt for AI-powered tools that can work on low-cost infrastructure. Many AI-powered tools run efficiently on smartphones and support low-connectivity scenarios. Phi-3-mini by Microsoft is one example of a lightweight and still highly functional AI model.
“Phi-3 is not slightly cheaper, it’s dramatically cheaper, we’re talking about a 10x cost difference compared to the other models out there with similar capabilities,” says Sébastien Bubeck, Microsoft’s vice president of GenAI research.
If infrastructure costs feel like a blocker, it’s worth exploring AI options that align with your technical and financial reality.
Request a consultation
Challenge 4: Teacher Training & Adoption Readiness
When an organization is ready and can afford AI tools, the next challenge of implementing AI in education is adoption: how can teachers use AI professionally? “Teachers have a lot of questions about AI,” says Olha Kanishcheva, a PhD researcher in Computer Science. And that’s understandable: quality AI-generated content requires good prompting, and teachers often don’t have time to dive deep into technical details.
The adoption issues of AI in education come from a lack of training, as only about 25% of schools train their staff to use ML tools.
Expert Solutions: Develop Hands-On Workshops
What does good AI training mean for teachers? What will actually support their work? When a 2025 qualitative study asked teachers these and other similar questions, educators clearly requested hands-on workshops, best practices for specific AI tools, and help with using prompts.
In practice, teachers’ requests often come down to simplifying or automating manual work. Oleg Baidakov, our technical lead for EdTech AI implementations, has seen this firsthand:
“When we started working with a major K-12 platform, we discovered that teachers were spending up to 40% of their time just on assessment creation.”
At CHI Software, we worked on such a case, developing a digital assessment platform to automate the process. After delivering the tool, we recommended that the client train teachers to gradually weave the platform into their regular work routine.
Challenge Solved: Tech Providers and Institutions Collaborate
Federal and national-level bodies are already responding to the AI challenges in higher education training and beyond. In particular, the American Federation of Teachers has launched a National Academy for AI Instruction in partnership with Microsoft, OpenAI, and Anthropic. Similarly, the National Education Association received a grant from Microsoft to expand educators’ AI literacy.
Individual educational institutions have also caught on to the trend: the University College of the Cayman Islands has already held campus-wide AI training for faculty, and the University of Toronto has run workshops on generative AI tools for teachers.
Challenge 5: Ethical Risks
Concerns over the ethical issues of AI in education are one reason teachers hesitate to adopt machine learning tools or do not fully trust them. Worries indeed run deep within the whole educational system as governance bodies work on developing clear ethical guidelines. Here are the main risks covered by the European Commission’s Ethical Guidelines on AI for educators:
Human agency and oversight: Is the teacher role clearly defined so as to ensure that there is a teacher in the loop while an AI system is involved?
Transparency: Is it clear which aspects AI can take over, and which are not within the system’s scope?
Diversity, non-discrimination, and fairness: Is the system accessible to everyone who needs to access it, in the same way without any barriers?
Accountability: Who is responsible for the ongoing monitoring of results produced by the AI system?
Expert Solutions: Partner Engineers and Educators in Solving Ethical Risks
Ivan Kuzlo, engineering director at CHI Software, offers a few practical solutions to ethical challenges both from tech providers and educational institutions:
On our side, the ethical challenges of AI are fully technological — we make sure AI outcomes pass our quality checks and pay close attention to data management for AI training.
Ivan Kuzlo
Engineering Director
At the same time, Ivan shares that the best examples of ethical AI use tend to put engineers into close dialogue and collaboration with educators. Here are a few practical solutions to ethical AI risks in education Ivan usually highlights for clients:
Make AI actions explainable to teachers and students. When adopting AI tools, make sure that everyone in the classroom understands how the system reaches its conclusions.
Require human oversight for every AI action. The teacher must remain in control of the educational process, including all assessments and recommendations.
Train educators in AI literacy and responsible use. Any quality training process must draw a line between automation and full delegation of control.
Run regular data audits. Once a machine learning model is trained on an organization’s data, the entire pipeline should be reviewed at least once per quarter. Test the model regularly for bias and inaccuracies on a small scale.
Align AI use with the curriculum. AI should not merely serve convenience; it should clearly support teachers’ educational goals.
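A quarterly audit does not need heavy tooling to start. Here is a minimal sketch of one bias check: comparing mean AI-assigned scores across student groups and flagging any gap above a tolerance. The groups, scores, and threshold are invented for illustration:

```python
from statistics import mean

# Hypothetical AI-assigned scores, grouped by an attribute under audit
# (e.g., school, language background) — values are illustrative.
scores_by_group = {
    "group_a": [78, 85, 90, 72, 88],
    "group_b": [61, 70, 65, 68, 64],
}

MAX_ALLOWED_GAP = 5.0  # illustrative tolerance, in score points

def audit_score_gap(groups: dict) -> tuple[float, bool]:
    """Return the largest gap between group mean scores and whether it needs review."""
    means = {g: mean(s) for g, s in groups.items()}
    gap = max(means.values()) - min(means.values())
    return round(gap, 1), gap > MAX_ALLOWED_GAP

gap, needs_review = audit_score_gap(scores_by_group)
print(gap, needs_review)  # → 17.0 True: this gap would trigger a human review
```

A flagged gap does not prove bias by itself — it tells engineers and educators where to look together, which is the collaboration Ivan describes.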
Challenge Solved: Our Experience With Fair Grading Feature
Grading open-ended responses is hard even for humans. Teaching AI to do so in an ethical and unbiased way is even harder. In one of our cases, an automated assessment platform, CHI Software’s engineers had to account for the meaning, nuance, tone, and context students used in their assignments. Our engineers addressed this challenge by fine-tuning AI models on paraphrased and rewritten content so that the system could spot suspected plagiarism and factor it into grading.
Ethical risks become manageable when engineers and educators work together — let’s look at your setup.
Request an expert review
Challenge 6: Measuring AI’s Impact
Businesses and institutions may pay more attention to problems with AI in education than to its benefits due to measurement issues. How can you really be sure whether your students’ engagement rates have improved thanks to bringing generative AI into the loop, or because you invested in personalized education? When the AI output isn’t clear enough, the investment value comes into question.
Expert Solution: Track Mid-Level KPIs for AI Performance
Yana Ni, chief engineering officer at CHI Software, thinks that the measurement challenges of integrating AI in education end when institutions track the whole cycle from mid-level to high-level business metrics.
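As a sketch of what “mid-level” means in practice, the snippet below turns raw before/after measurements into two numbers: percent time saved (a mid-level KPI) and hours saved per month (an input to high-level ROI). The figures and formulas are illustrative assumptions, not a client’s data:

```python
# Baseline measured before the AI rollout vs. after, for one workflow.
manual_minutes_per_quiz = 45   # illustrative
ai_minutes_per_quiz = 27       # illustrative
quizzes_per_month = 120        # illustrative

def time_saved_pct(before: float, after: float) -> float:
    """Mid-level KPI: how much faster the task got, as a percentage."""
    return round((before - after) / before * 100, 1)

def hours_saved_per_month(before: float, after: float, volume: int) -> float:
    """Rolls the mid-level KPI up toward a high-level ROI figure."""
    return round((before - after) * volume / 60, 1)

print(time_saved_pct(manual_minutes_per_quiz, ai_minutes_per_quiz))        # → 40.0
print(hours_saved_per_month(manual_minutes_per_quiz, ai_minutes_per_quiz,
                            quizzes_per_month))                            # → 36.0
```

Tracking the chain this way — raw logs, then mid-level KPIs, then business metrics — is what lets you attribute an engagement gain to the AI feature rather than to other investments.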
Challenge Solved: Track Time, Engagement, and Business Expansion Rates
At CHI Software, we deal with all the problems of using AI in education discussed above regularly. But we also see the other side of the coin: big-picture insights like new applications that our clients are discovering for AI, what metrics they are tracking, and which results are catching their attention. Here are a few measurable metrics you can think of, based on our clients’ experience:
The case with a chatbot for educational institutions resulted in a 40% decrease in teachers’ time spent on manual content creation and a 25% improvement in decision quality.
In another case with an online educational solution for kids, multilingual support and optimization helped a client expand business to nine countries.
In short, you can address the challenges of implementing AI in education step by step: add guardrails, human oversight, and measurable KPIs before scaling AI tools.
Conclusion
As you can see, businesses, educational institutions, and governments are working together to make AI a supportive assistant to educators, not a threat. The challenges are still real, but the diverse experts we consulted all point in the same direction: issues with AI in education are solvable with the right tech skills.
At CHI Software, we have the skills to resolve the challenges standing between your business and effective AI assistants. They are at your disposal as soon as you contact our team for a quote on what it would take to strengthen your platform with AI, without the typical pitfalls.
FAQs
How can AI improve education without replacing teachers?
AI is not designed to replace teachers. It is designed to reduce human effort on manual tasks, including:
- Offloading repetitive workload. AI can grade both short and long-form answers, generate questions, and sort and adapt materials.
- Surfacing insights. ML can notice patterns in mistakes or early signs of struggles that might go unnoticed by teachers.
- Personalizing learning. Artificial intelligence can stand beside each student and resolve the “one teacher per 30 students” issue.
- Delivering constant guidance. Algorithms can help, support, and advise during late-night study sessions, and actually make them shorter.
How do students and teachers typically respond to AI features?
In our experience and knowledge of the recent research papers, the reactions are mainly positive. From what we’ve seen:
- Students are often the first to push for AI adoption, since they use AI to understand their material faster.
- Teachers appreciate AI for help with lesson preparation, assessments, and administrative work.
- At the same time, educators are cautious about safety risks in AI features and the overreliance on AI.
- By and large, students and teachers tend to adopt AI for minor tasks like seeking information, and blend AI with their personal work for more serious tasks.
What’s the biggest technical challenge in adopting AI for education?
When asked this question, our engineers agreed that the security issues have been the most challenging to solve, including:
- Integrating AI with learning management systems and the school or platform’s databases, which requires strict access control;
- Keeping track of FERPA, COPPA, and GDPR compliance;
- Keeping student data anonymous, while still extracting generic points from it to train AI.
The most effective practices our AI engineers use to address these challenges are encryption, secure data storage, access controls, regular audits, and compliance checks.
What if my team doesn’t have in-house AI expertise?
Most EdTech teams start exactly there. This situation is more manageable than it may look to you at the moment. Here’s what you can do for the first steps:
1. Clarify a single problem you want AI to solve. Be specific: go for goals like “reduce grading time for short essays” rather than “improve time management”.
2. If you can, map the current user journey. Draw the current flow of how teachers, students, or admins touch the process you want to improve.
3. Define what success will look like to you. Before you get in touch with an external vendor, make sure you understand which metrics will be able to prove that the tool is effective.
4. If possible, prepare gold-standard examples. While you are choosing vendors, your educators can get a head start by sorting out your best grading practices.
5. Create a short vendor brief. Put the above points on the page and prepare your quality criteria for vendors.
How can we balance AI innovation speed with platform stability?
Balancing comes from protecting the core platform’s functionality and from testing every add-on on a small scale first. Here are some practices we recommend:
- Adding modular AI components that can be switched on and off without disrupting main system functions.
- Testing new models in isolated sandboxes before launching them to your target audience.
- Rolling out new AI features gradually so that you can monitor their performance.
- Collecting teachers’ and students’ feedback to make insight-driven decisions instead of simply relying on your own judgments.
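The first three practices above can be combined in a simple feature-flag sketch: the AI grader is a modular add-on rolled out to a deterministic fraction of users, while the stable core path stays untouched. The rollout percentage and function names are illustrative:

```python
import hashlib

# Illustrative: expose the new AI feature to roughly 20% of users first.
ROLLOUT_PCT = 20

def ai_feature_enabled(user_id: str, rollout_pct: int = ROLLOUT_PCT) -> bool:
    """Deterministically bucket users, so each user always gets the same experience."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < rollout_pct

def route_grading(user_id: str) -> str:
    """Switchable add-on: AI grading for the rollout cohort, the old path otherwise."""
    if ai_feature_enabled(user_id):
        return "ai_grader"      # can be turned off by setting ROLLOUT_PCT to 0
    return "manual_queue"       # stable core functionality, untouched by the rollout

enabled = sum(ai_feature_enabled(f"user-{i}") for i in range(1000))
print(enabled)  # roughly 200 of 1000 simulated users land in the AI cohort
```

Because assignment is hash-based rather than random per request, monitoring stays clean: you can compare the AI cohort against the control group over time and widen `ROLLOUT_PCT` only when the feedback supports it.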