What is it and to whom does it apply?
On November 20, 2024, the US Department of Education (ED) Office of Civil Rights (OCR) released a new resource on Avoiding the Discriminatory Use of Artificial Intelligence.
The resource, which applies to all schools subject to federal civil rights laws, including K-12 and higher education institutions, explains the legal analysis OCR uses to determine whether a school is using artificial intelligence (AI) in a discriminatory manner. It provides examples of conduct that could be grounds for an investigation under various federal civil rights laws – including Title VI of the Civil Rights Act of 1964 (Title VI), which prohibits discrimination on the basis of race, color or national origin; Title IX of the Education Amendments of 1972 (Title IX), which prohibits discrimination on the basis of sex; and Title II of the Americans with Disabilities Act of 1990 (ADA) and Section 504 of the Rehabilitation Act of 1973 (Section 504), which prohibit discrimination on the basis of disability.
What does the resource say?
Unlike the toolkit that was released by ED’s Office of Educational Technology, which provides advice for mitigating risks, building strategies for AI integration and maximizing use of AI in schools, the OCR resource addresses how the agency will enforce anti-discrimination laws for schools’ AI use. The resource does not have the force of law but is key to understanding what may trigger an OCR investigation – which is important because, in addition to reputational harm, OCR investigations can result in significant fines and penalties for violations.
The resource includes 21 examples of incidents involving AI that could create the basis for an OCR investigation. Those examples include the following:
- Title VI: Using AI to check for plagiarism could trigger an OCR investigation if the tool results in allegations of plagiarism for students who are non-native English speakers. OCR describes an instance where an instructor uses AI to check for plagiarism, but the tool has a higher error rate for non-native English speakers, resulting in more of those students’ essays being flagged, and those students facing disciplinary action or receiving a lower grade. OCR would have grounds to investigate the school for a violation of Title VI because the AI tool impacted English learners’ ability to participate in the class.
- Title IX: Failing to respond to a student’s use of AI to create fake explicit images of other students that are viewed and discussed at the school could trigger an OCR investigation. Schools have an obligation to respond to and prohibit sex-based harassment, and the resource provides that it may be a Title IX violation if a school fails to adequately respond to and prohibit harassment of the subjects of such images.
- Section 504: Using a generative AI tool to write Section 504 plans for students with disabilities could trigger an OCR investigation. The resource provides that a school’s use of a generative AI tool to write Section 504 plans could violate the school’s obligation to provide students with disabilities a free appropriate public education if the school does not review or modify the generated Section 504 plans to meet the specific needs of each individual student.
How should schools and edtech companies prepare?
Schools
Schools need to adequately evaluate AI tools before using them to determine whether there is a risk of discrimination. Additionally, schools should educate faculty and staff about potential risks before permitting use of such tools in the classroom. Having a human element – such as teachers – monitoring the impact of tools in real time may bolster an argument that the school responded reasonably where discriminatory impact could not be detected before classroom use. Further, schools need to properly investigate complaints about potential discrimination resulting from the use of tools both inside and outside the classroom.
Edtech companies
Edtech companies should be aware of these potential issues when developing their technology and take steps to address them early, before they create a problem. Ensuring products are not discriminatory will be important for the long-term success and reputation of the company. When contracting with schools, edtech companies should expect schools to ask about the impact of their technology, so they should be prepared to explain how the AI technology will be used and to be accurate and clear about its impact. Additionally, edtech companies should create robust acceptable use policies that allow them to suspend a user's account for inappropriate or prohibited behavior. After a student engages in prohibited conduct – such as creating inappropriate images that would constitute sex discrimination – the company can then consult with the school to determine whether to restore the student's access.
Election impact
AI is not going away, and neither is the bipartisan focus on safety for students and children using AI tools. Technology is rapidly evolving, and the law continues to try to keep up, which may result in ED and other agencies issuing more guidance like this while laws and regulations are developing.
While AI will remain top of mind, early indications are that the new administration’s enforcement strategy for violations of civil rights laws will likely differ from the current administration’s. Because the OCR resource is a guidance document and not a formal regulation, the new administration has the prerogative to modify it, rescind it or simply not enforce it. Precedent exists from the previous Trump administration, during which ED rescinded a 2014 Obama-era Dear Colleague Letter addressing racial disparities in student discipline.
Moreover, if the Trump administration is not as focused on proactive enforcement by OCR, fewer investigations will be initiated even if this resource remains unchanged. Likewise, if the incoming administration prioritizes efforts to dismantle diversity, equity and inclusion (DEI) programs in schools and discourage classroom discussions on topics such as race and gender, the content schools and edtech companies permit on AI tools will be affected. Finally, anticipated changes to Title IX will affect what is deemed sex discrimination and how schools should respond when it occurs on AI tools.
Regardless of the new administration’s approach, schools and edtech companies will need to work together to ensure tools are nondiscriminatory, and should look to this resource for guidance as long as it remains in effect.