This article is part of an ongoing series published by the author on LinkedIn.
Okay. This one might sting, but I feel it's necessary for us to discuss. I really need to talk to you about a word you keep using lately: 𝑨𝑰.
A lot of you seem to be putting that label all over your products. Your dashboard is now "AI-powered." Your reporting is now "AI-driven." Your quiz generator is now "AI-enhanced." So I have a question: 𝒘𝒉𝒂𝒕 𝒘𝒂𝒔 𝒊𝒕 𝒄𝒂𝒍𝒍𝒆𝒅 𝒃𝒆𝒇𝒐𝒓𝒆?
Because I have been in this work long enough to remember when the exact same feature was called an algorithm. Or automated analytics. Or adaptive learning. It looks like you changed the label, but I am not sure you changed the core product or its functionality.
Recently, I heard the term 𝑨𝑰-𝒘𝒂𝒔𝒉𝒊𝒏𝒈. I didn't coin it and don't even like it, but it seems to describe a lot of what is happening lately. I am not saying you are doing it deliberately. Some of you genuinely believe the label fits.
But here is what happens on my end.
Recently, I sat in a product demo. I saw a button that said "AI-powered." So I asked which Large Language Model it was built on, and what that button actually did. The vendor said, "I am not sure. I will have to get back to you." I never even got to my next questions: was this model being trained on student data, and did everyone consent to that?
I am not asking these questions to be difficult. I am asking them because 𝗜 𝗮𝗺 𝗿𝗲𝘀𝗽𝗼𝗻𝘀𝗶𝗯𝗹𝗲 𝗳𝗼𝗿 𝗲𝘃𝗲𝗿𝘆 𝘁𝗼𝗼𝗹 𝘁𝗵𝗮𝘁 𝗺𝘆 𝘀𝘁𝘂𝗱𝗲𝗻𝘁𝘀 𝗮𝗻𝗱 𝘀𝘁𝗮𝗳𝗳 𝘂𝘀𝗲.
To me, the AI label is not a minor marketing choice. It changes what I have to ask. What I have to investigate. What I have to explain to parents. What I have to document. When you put GenAI on your product without being able to explain it clearly, you are not making my job easier. You are adding a new layer of risk I now have to manage.
I want to be clear: 𝗜 𝗯𝗲𝗹𝗶𝗲𝘃𝗲 𝗶𝗻 𝗚𝗲𝗻𝗔𝗜'𝘀 𝗽𝗼𝘁𝗲𝗻𝘁𝗶𝗮𝗹 𝗶𝗻 𝗲𝗱𝘂𝗰𝗮𝘁𝗶𝗼𝗻. I even wrote a book about it. I love technology. In fact, my favorite job was being a tech coach for a large district before becoming an administrator. I absolutely believe in the power of well-designed technology. I am just not sure whether WE are going about it the right way.
So here is what I am asking. Before you put AI in your product name, your pitch deck, or your subject line, please answer:
➡️ What does the AI actually do in your product, specifically?
➡️ What data was it trained on, and did your users consent to that?
➡️ What happens inside your system when the AI gets it wrong?
I have a lot more questions, but I will start with these three. Because the word AI is not a finish line. It is a responsibility. Are you treating it like one?
Kip Glaser is a high school principal in Silicon Valley, using her experience and knowledge to help school leaders and educators with EdTech and AI.
Kip is the author of Ready to Lead with AI: A Practical Guide for School Leaders, which offers guidance for educators wrestling with how to use AI effectively in their schools. This book serves as the roadmap school leaders need to prepare for the future of AI while keeping student success at the forefront.
