
Laurel Aguilar-Kirchhoff


United States



The rapid integration of AI and EdTech tools in K-12 education offers the promise of personalized learning, but it also presents a significant challenge for districts, schools, and educators: ensuring student data privacy and navigating the ethical complexities of these technologies. Many educators are eager to adopt these innovative tools, yet many don't fully understand the risks surrounding student data privacy. Understanding the scope of data collected by various AI tools, how that data is used, and the potential for misuse can be daunting. Staying informed about privacy laws like FERPA, COPPA, and CIPA adds another layer of complexity for educators to understand and communicate to their students.

The increasing use of AI also raises ethical concerns about potential biases within algorithms and the implications of large-scale student data collection. Schools face the task of evaluating these tools for both their pedagogical value and their potential to perpetuate existing inequities. Balancing the benefits of innovation with the responsibility to protect students' privacy and ensure ethical technology use is a complex and ongoing challenge in the digital age of education.

Some questions to be addressed: Are the algorithms used in these tools trained on diverse data sets to ensure equitable outcomes for all students? How do we address potential biases that may perpetuate existing inequalities? What are the ethical implications of collecting and analyzing student data on a large scale?

Ultimately, the problem remains: How do we create an environment where teachers feel confident and informed about using EdTech and AI tools while prioritizing student privacy and ethical technology use? Can we develop a system that empowers educators to harness the benefits of innovation, fosters open communication between stakeholders such as educators and IT departments, and prioritizes critical evaluation of these tools for data privacy and potential biases?
