When school districts craft or update their policies on the use of artificial intelligence, they should set clear expectations but leave room for students and teachers to make mistakes, according to superintendents who have led their districts through establishing guidelines for the powerful technology.
AI has taken the education community by storm in recent years, particularly since ChatGPT came on the scene a little over a year ago with the ability to whip up entire essays in seconds, alter and create images, and solve complex math problems.
Yet for all the talk about AI, most districts are still trying to figure out the technology’s implications for plagiarism and data privacy, and to develop guidelines for its ethical use.
But district leaders who have taken the plunge and begun to grapple with these challenges said it’s important to be clear about expectations, particularly for staff members, so they have some license to experiment within reasonable boundaries.
Because AI is rapidly evolving, and to allow room for that experimentation, the expectations district leaders have set most often center on what not to do rather than what to do, they said recently in a panel discussion at the National Conference on Education sponsored by AASA, The School Superintendents Association.
“The biggest feedback we’ve gotten is that our teachers want to know what’s OK or what’s not OK, not because they’re afraid of it, but because they want to make sure they don’t do something wrong,” said Mary Catherine Reljac, superintendent of the Fox Chapel Area school district in Pennsylvania.
The big no-nos, Reljac said, include inputting students’ or colleagues’ personal information, or any specific identifiable information about the school district, into an AI tool. Staff members are also advised against downloading or using just any available tool; the district’s technology department follows a specific process to vet products first, she said.
From there, individual teachers should also have classroom-level policies that are molded to fit the subjects and content they teach, because AI can be used differently in different scenarios, said Patrick Gittisriboongul, assistant superintendent of technology and innovation in Lynwood, Calif.
“It’s important for every teacher, every subject area, every content area, every grade level to look at AI and start really anticipating what kinds of questions are going to come up from students,” he said. “It’s a matter of time that students and staff are going to be using the tool on a daily basis, if they’re not already.”
But district leaders need to keep an open mind, Reljac added, and understand that there is so much still in flux when it comes to AI.
“As the superintendent, I’ve given people permission to mess this up, and we’ve tried to reassure people that if you have good intentions and you’re taking good precautions, if what you’re doing doesn’t go well, let’s just talk about it and use it as a learning experience, instead of a punitive thing,” she said. “That has helped to relieve some of the anxiety people have.”
Along with training and flexibility, it’s important that districts planning to allow and integrate AI into their classrooms consider what to do if a student’s parents opt them out of using the technology, the district leaders said.
It’s likely there will be some students in that situation, said Kelly May-Vollmar, superintendent of the Desert Sands school district in La Quinta, Calif.
She recalled when Chromebooks, now a staple in many American classrooms, first gained traction and some parents pushed back.
Communication is key in those situations, she said, encouraging superintendents, principals, and technology leaders to have individual conversations with parents to discuss their concerns. Often, they’ll change their minds afterward, she said.
But if they don’t, Reljac said, districts must find a way to allow students who don’t have access to AI to master the same standards and skills as their peers. That’s not a new concept: parents can opt their children out of any course content in her district, she said.
“When that happens, our educators have to create a different way for the students to move forward, and we’re using the exact same approach here,” Reljac said. “It’s such a small number, but it’s an important number.”
May-Vollmar added: “Technology is a tool. You have learning objectives and you decided on outcomes you wanted for students, which you can accomplish in 15 million different ways. Using generative AI is a way to do that. It’s not the only way.”
Regardless, students who want to use the technology and have their parents’ OK should have equal access, she said. That means district leaders have to ensure teachers across the district feel comfortable using AI and don’t shy away because it’s unknown, she said.
To encourage teachers who are more hesitant about the technology to test it out, May-Vollmar said her district has created “AI playgrounds,” where educators receive some instruction about AI and try it in a low-stakes environment intended to be fun and encourage exploration.
There are stations with different focuses, perhaps a specific tool or function, and teachers rotate through, May-Vollmar said.
“That kind of experience created a lot of momentum and encouraged our people who have been hesitant to use technology,” she said.