This article originally appeared on CoSN’s blog and is reposted here with permission.
In recent years, school districts have shown increasing interest in the potential of Generative AI (GenAI) to revolutionize education. GenAI offers the promise of enhancing personalized learning, streamlining administrative tasks, and providing innovative educational resources. However, as districts rush to adopt these cutting-edge technologies, they must carefully select AI tools that meet their unique needs, because rapid adoption brings significant risks, particularly regarding data privacy and accessibility.
Ensuring that AI tools protect student data and comply with accessibility standards is crucial for creating an inclusive and secure educational environment. This blog post will explore expert recommendations for selecting GenAI tools, helping districts navigate these challenges effectively.
Data privacy considerations and recommendations for GenAI adoption in schools
Linnette Attai, Project Director for CoSN’s Student Data Privacy Initiative and President of the compliance consulting firm PlayWell, LLC, shares insights on data privacy risks associated with adopting GenAI tools and offers guidance for responsible implementation.
While security breaches are a common concern, Linnette emphasizes that protecting students’ privacy and data involves more than just avoiding breaches. There is a broader responsibility to safeguard students’ emotional well-being and personal information, or as she calls it, a ‘responsibility of care.’ Key privacy considerations include:
Ownership and control of data:
District leaders should be cautious when using large language models not specifically designed for educational purposes. These models might use student data to further train the AI, raising concerns about the commercial use of personal information and potential exposure of sensitive data. In addition, for some districts, any type of commercial use of personal information is unlawful.
Linnette advises districts to adhere to fundamental practices when adopting new tools:
- Have a clear objective: Despite the growing popularity of GenAI tools, districts should identify a specific reason for their use. This approach ensures that the tool aligns with district needs and maximizes its impact on student outcomes.
- Be informed before testing: Districts must thoroughly understand the tool, including its privacy practices, security measures, and contract terms, before making a commitment. In particular, districts must ensure that the tool will be used solely for educational purposes.
- Start with staff: Testing AI tools with staff, rather than students, helps avoid premature exposure of student data. Some companies offer beta tests or sandboxes for staff to simulate student experiences, which can be a valuable way to assess the tool’s effectiveness.
A practical example: Hinsdale Township High School District 86
Keith Bockwoldt, Chief Information Officer for Hinsdale Township High School District 86 in Illinois, shares his district’s thoughtful approach to GenAI. Keith’s ‘Reimagining Learning through Innovation’ program allows teachers to pilot new tools funded by the district’s IT budget. Teachers submit proposals for evaluation, which are assessed for compliance with data privacy policies before pilot implementation. Teachers must then provide evidence of the tool’s impact by year-end, and the department reviews whether the tool should be adopted more broadly.
Keith highlights two critical considerations:
- Vendor compliance: He ensures vendors are aware of and comply with data privacy policies, such as the Student Online Personal Protection Act (SOPPA). He discusses data protection measures, including data purging and storage practices.
- Ongoing vendor engagement: Continuous communication with vendors is crucial for maintaining compliance with data privacy standards.
Ensuring accessibility
Jordan Mroziak, Project Director for AI and Education at InnovateEDU, emphasizes the need for a deliberate approach to adopting new technologies. He warns against an educational arms race of adopting unproven or potentially unsafe AI products. Instead, districts should focus on meeting the needs of all students, particularly those who are underserved or disadvantaged. As a helpful resource, Jordan shared the work he and his colleagues have done with the EdSAFE AI Industry Council, which aims to offer guidance and reliable standards for districts exploring GenAI tools. Companies join this alliance by demonstrating how their products adhere to the SAFE framework for AI, which focuses on safety, accountability, fairness, equity, and efficacy. This collective effort ensures that AI tools are developed with these critical principles in mind, promoting responsible and effective use.
Additionally, the recent update to ADA Title II requires that accessibility be prioritized from the beginning. Districts must choose AI tools that comply with ADA standards and ensure equitable access for all students. This process includes assessing tools for adherence to accessibility guidelines, involving diverse stakeholders in testing, and making adjustments to meet various learning needs. By addressing these requirements proactively, districts can ensure that their AI tools are inclusive, effective, and legally compliant, thereby maximizing technology's benefits for every student.
For more recommendations on accessible GenAI implementation, read Blog 5 of this series: Adapting to ADA Title II: Effective Strategies for Accessible AI in Education.
The integration of Generative AI tools into education offers significant opportunities for enhancing learning and efficiency. However, it also poses challenges related to data privacy and accessibility. Thoughtful implementation and ongoing evaluation are essential to maximize the benefits of these tools while ensuring the protection and support of all students.