C3PO from "Star Wars." HAL from "2001: A Space Odyssey." "The Terminator." And now Apple's Siri and Amazon's Alexa. Artificial intelligence has always been part of our collective imagination. But it's now becoming part of our everyday lives.
There is, of course, a ton of hype. Experts think this new type of "machine learning" could help people do all sorts of things over the next couple of decades: power self-driving cars, cure cancer, cope with global warming, and yes, transform K-12 education and the jobs students are preparing for.
It's too early to say how much of that promise will pan out. But it's a good idea for educators to get familiar with AI, whether they are the chief technology officer of a large urban district or a 1st grade teacher in a rural community.
So what exactly is AI? The simplest explanation is that AI trains a machine to do tasks that simulate some of what the human brain can do. That means it can learn to do things like recognize faces and voices (helpful for radiology, security, and more), understand natural language, and even make recommendations (think of the algorithm Netflix uses to suggest your next binge-worthy TV show). And, as the technology advances, much more could be possible.
How It Works
So how does that actually work? That's a complicated question, in part because experts aren't always on the same page about what AI is and what it isn't.
Right now, all sorts of technology, including educational software, is "adaptive." That means it's pre-programmed to take certain steps, based on what a user, say, a student, does. In simple terms, if a kid taking an adaptive test gets an answer right, the system knows to give that kid a tougher question next. (Think of this as a much more sophisticated, computerized version of those choose-your-own-adventure books you might have read as a kid.)
Plenty of experts would call those systems "AI," and plenty of vendors market their educational software that way. But these so-called "rule-based" systems aren't the "fancy, sexy AI" that's grabbing headlines, said Robert Murphy, a senior policy researcher at the RAND Corporation. That's because all the information is pre-programmed. The machine can't get any better at a particular task.
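To see why a rule-based system can't improve, it helps to look at one. Here is a minimal sketch of the adaptive-test rule described above; the difficulty levels and step size are hypothetical, for illustration only, and no real testing product is being quoted.

```python
# A rule-based "adaptive" system: every behavior is pre-programmed.
# Right answer -> step difficulty up; wrong answer -> step it down.
# The rule never changes, no matter how many students use it.

def next_difficulty(current_level: int, answered_correctly: bool,
                    min_level: int = 1, max_level: int = 5) -> int:
    """Return the difficulty of the next question, clamped to the allowed range."""
    if answered_correctly:
        return min(current_level + 1, max_level)
    return max(current_level - 1, min_level)

# A student starts at level 3, then answers right, wrong, right.
level = 3
for correct in (True, False, True):
    level = next_difficulty(level, correct)
print(level)  # -> 4
```

However many students take the test, the branching logic stays exactly as its authors wrote it, which is the distinction Murphy is drawing.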
Cutting-edge AI relies on systems that can actually learn, usually by analyzing vast quantities of data and searching out new patterns and relationships. Instead of following a single predetermined pathway, these systems can actually improve over time, becoming more complex and accurate as they take in more information.
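The contrast with the rule-based example is that here nothing is hard-coded: the system's answer comes from the data it has seen, and more data sharpens it. A toy sketch, using made-up measurements of a hidden relationship (roughly y = 2x, an assumption for this illustration) and a one-parameter least-squares fit:

```python
# A system that "learns": its estimate of the hidden slope comes entirely
# from examples, and the estimate changes as more examples arrive.

def fit_slope(examples):
    """Least-squares slope through the origin: w = sum(x*y) / sum(x*x)."""
    sxy = sum(x * y for x, y in examples)
    sxx = sum(x * x for x, _ in examples)
    return sxy / sxx

# Noisy observations of a hidden relationship, roughly y = 2x.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.9), (5, 10.1)]

few = fit_slope(data[:2])   # trained on only 2 examples
many = fit_slope(data)      # trained on all 5
print(round(few, 2), round(many, 2))  # -> 1.98 2.01
```

Real machine-learning systems fit millions of parameters rather than one, but the principle is the same: the behavior is estimated from data, not written down in advance.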
Current Use of AI in Schools
How is AI being used in K-12 schools?
Classrooms are already using AI-powered tools, including smart speakers like Amazon's Alexa or Google Assistant. And school districts are beginning to use the technology to do things like plan bus routes, screen applications for teaching positions, and even predict when a piece of HVAC equipment is likely to go bad.
But widespread use of much more sophisticated tools in the classroom is down the road, said Michael Chui, a partner at McKinsey & Company who has a deep background in computer science. Eventually, AI has the potential to individualize lessons for students "the way a really, really awesome teacher does," Chui said. But he cautioned, "it's very, very early."
Already, though, at least some form of AI is used in so-called smart tutors, which help schools differentiate instruction for different kinds of learners. In some cases, these tutors can process natural language to interact with students. AI is also used in applications like automated essay scoring and early-warning systems, which can help identify which students are at risk of dropping out.
To be sure, education technology companies have even loftier ambitions for how AI might reshape K-12 education. Case in point: one game-based classroom-management tool is teaming up with researchers at the University of Montreal to see if AI can find patterns in student engagement and use them to make suggestions to teachers. Right now, the program allows teachers to give students "points" for positive behavior such as critical thinking, collaboration, and even empathy. The company and researchers hope to use that data to help teachers figure out when their students are less likely to be engaged and combat that problem.
Potential Trouble Spot: Bias
The use of artificial intelligence in education is expected to explode to a worldwide market value of $6 billion over the next six years. And about 20 percent of that growth will come from applications for K-12 teaching and learning in the United States, according to a report by Global Market Insights. What's more, the McKinsey Global Institute predicts that some teacher responsibilities, mostly noninstructional ones like tracking student progress and communicating with parents, could be automated by 2030 with the help of AI.
But don't expect an army of AI-powered robots to be filling out teacher job applications at a district office near you. Andreas Oranje, a general manager in the ETS Research Division, said during a session at the International Society for Technology in Education's annual conference this year that he expects AI will ultimately help educators perform rote tasks, not replace them.
"My hope for AI is we actually will expand teaching," Oranje said. "No teacher ever lost her job because every kid had an iPad. We need more teachers, not fewer. The nature of teaching will change. But it doesn't mean that 40 percent of teachers will lose their jobs."
What are some of the problems with using AI in classroom technology? AI may be fancy and sexy, but it's far from perfect. One big problem: Human biases can be written right into the algorithms that power AI and then amplified by the technology. What's more, the data these systems use can also be biased. That can lead the machines to inaccurate, discriminatory, and even racist conclusions.
How this plays out in the real world: Facial recognition software, which is currently used for airport security and may even be deployed for school safety, is notoriously bad at identifying women and people of color. More troubling: Studies have shown that risk-assessment algorithms used to figure out criminal sentences tend to make harsher predictions about black defendants than white defendants. And Tay, a chatbot developed by Microsoft, was supposed to figure out how to emulate natural conversation by interacting with Twitter users. Instead, it began communicating in vulgar and racist hate speech.
Not the Ultimate Decisionmaker
Bias issues may not be such a big deal if an AI-powered system is trying to, say, predict what pair of pants a retail customer will buy next. But they are a problem if the system is deciding whether a student should apply to a particular college, or suggesting a specific lesson for an individual student, Oranje said.
That's why AI systems, especially those designed for teaching and learning, shouldn't be the ultimate decider of what students learn or what their educational pathway should be, Murphy said. But AI can still be an important supplemental tool, he added.
"Maybe 10 percent, 20 percent, 40 percent of the time [the system] will get it wrong," Murphy said. "It will vary by system, but 70 percent of the time they'll get it right." The systems could still help districts individualize instruction, but educators need to remain the most important part of the equation.
And of course, there's another big, obvious concern: Data privacy, especially for K-12 students. That's something educators, as well as advocates on both sides of the privacy debate, are keeping an eye on.
"I'm super against this idea of 'let's put an Alexa in the classroom,' because you're giving Amazon access to kids' voices without parents' consent," said Mary Beth Hertz, the technology coordinator for the Science Leadership Academy at Beeber in Philadelphia. "I personally am trying to learn more about what ways should we use AI in the classroom, keeping in mind privacy concerns around the data. AI doesn't work unless you are feeding it data."
Tech Skills for the Future
Should students be learning skills sophisticated enough for them to get involved in creating AI? A lot of experts see that as the next frontier. Many are particularly interested in making sure that students from groups that have historically been underrepresented in STEM fields鈥攊ncluding girls and racial and ethnic minorities鈥攁re involved in creating AI, to help counteract the potential for bias.
A jarring but true fact: Vladimir Putin told millions of Russian schoolchildren that the nation that leads in AI "will be the ruler of the world." And China is striving to be the world leader in AI by 2030.
But in the United States, many schools aren't even offering computer science courses, much less AI learning opportunities. Big barriers include a lack of curriculum and instructional materials. Some experts and educators are trying to change that, including a working group that is developing national guidelines to help schools figure out what to teach about AI. But for now, fewer than 100 schools in the country offer some form of K-12 instruction in this area, experts estimate.
Another challenge even for relatively affluent, tech-savvy districts like Leyden High School District 212 outside Chicago: Finding educators who can teach those high-tech skills, particularly at a time when the teacher ranks in general are thinning in some parts of the country.
"What we really need are teachers with a level of humility who are willing to learn alongside the students at this point," said Nick Polyak, the superintendent of the district. "The traditional method of learning a topic deeply in college and then going on to teach isn't relevant anymore because the knowledge is changing too quickly."
But he sees meeting that challenge as an imperative.
"I don't want our students to be the people who just buy autonomous cars," he said. "I want them to be the people who are designing and improving them. It's imperative on us to provide an education that makes them ready to step into the evolving job market."