I Need to Get Smarter on AI
We are lucky to have a network of collaborators, and this week's article is written by Paul Burke. Paul is the Head of School at The Nightingale-Bamford School in New York City. Leadership + Design is a 501(c)(3) nonprofit, and Paul currently chairs our board. Learn more about our board here.
Estimated Reading Time: 4-6 min
“I need to get smarter on AI.”
That was me. I said it. I said it at an end-of-year meeting with my Board President, no less. “Get smarter on AI.” It sounded strong. On the spot. Leaderly.
She nodded in affirmation and, mercifully, asked no follow-up questions. We had other things to discuss.
Since then, I have tried to do the things one does to get smarter. I’ve attended some sessions. Listened to some of the pods. Read some of the articles. Talked with some of the people who are leading AI initiatives. I haven’t done a webinar, and I don’t see that changing. To webinar designers, I am the problem here. Most of the time, even a well-crafted webinar helps me make a dent in my inbox. Maybe AI could help me there?
Personal webinar hangups aside, AI is everywhere. Content about AI is being produced almost as quickly as the tools themselves. The volume alone can overwhelm and invite retreat.
And, if I were really honest about it all, part of me desires retreat, and that is not my typical mode. Perhaps it is because gaining AI understanding has a "you'd better learn, or else" element to it. That element can be helpful and necessary, and it probably has been here, but it has a very different feel than explorations driven more fully by curiosity or growth.
This has only dawned on me now, thanks to the opportunity to write. (Underlying lesson: take opportunities to write. Don’t turn them over to AI. Writing helps.) I think the reason AI is hard for school leaders, regardless of our experience, is tied uniquely to the task before us.
I lead a K-12 girls’ school. Last week we enrolled the Nightingale Class of 2039. Our goal is to do well by them when they are five, and to do well by them when they are eighteen and graduating.
Here is the deal. That little kindergarten admit? She is to blame. It is totally her fault.
As much as I would like to tell her otherwise, I don’t have any clear sense of what her world will be in 2039. None of us do, even those of us who pay attention during webinars.
This moment calls school leaders not only to transform learning, but to do so while anticipating transformations across nearly every sector of society.
That is a big ask.
So what do we do?
The first thing we can do, I believe, is stake our claim.
Here is what I have come to believe quite strongly: school leaders need to lean into this AI moment in ways that leverage our actual expertise. If we don’t, we will quietly yield the space to others. And I am not convinced that is great for kids.
We steward learning inside a container called school — one first shaped by industrialists and politicians during the last major technological revolution.
In the centuries since, we have accumulated deep knowledge about how human beings grow, struggle, connect, and learn. Why would we downplay that understanding at precisely the moment it is most needed? Are we waiting until we feel “smart enough” about AI before we claim authority?
Maybe our incomplete knowledge about AI, combined with our more developed — though never complete — understanding of school, is precisely the kind of “good enough” we need right now.
We are still early. The high-level questions matter more than the technical mastery. As a full L+D acolyte, I believe if we ask the right design questions, we can iterate, own our authority, and learn our way toward good outcomes for children and teachers alike.
It is tempting to debate whether AI belongs in schools. I understand the instinct. But persisting there may heighten rigidity and fear among both adults and students. AI is here, and more is coming.
The second thing we can do is design the container AI enters.
Because we may not write the code, but we absolutely shape the context. We shape the language around it. The norms. The boundaries. The expectations. The spaces where it is used — and the spaces where it is intentionally absent.
We shape what intelligence means inside our walls.
In one classroom, AI might help a student receive immediate feedback on a math proof while her teacher moves between desks asking harder questions. In another, students might debate an AI-generated argument precisely because it is imperfect. In yet another, a teacher may simply close the laptops and say, “Today we think together.” The point is not the tool. The point is that adults who understand children decide how learning unfolds.
Here are a few questions I would love to see school leaders wrestle with together:
How might this AI moment help us examine where our centuries-old model of schooling has served students well — and where it has missed them?
How might we design schools that infuse AI to deepen understanding while strengthening what has always mattered most — relationships, belonging, and the raising of young people within a network beyond their family? To borrow my mother-in-law’s favorite metaphor: how might we strengthen roots even as we extend branches?
How might we tell our own AI stories before algorithms concoct them and tell them for us?
Let’s take the first question for a moment.
If school leaders gathered to reflect honestly, we might arrive at shared admissions fairly quickly.
Admission #1: School has served some while missing others. The more successful the school, the fewer students it misses — but no model has been perfect.
Admission #2: We should be raising both the floor and the ceiling of learning.
Admission #3: Learning benefits from timely, individualized feedback — something our labor-intensive system has historically struggled to provide at scale.
Even in these early days — and even without attending a single webinar — I have come to believe AI holds potential to help us address each of these. Not by replacing teachers, but by amplifying our capacity to differentiate, to respond, to notice.
And yet, there is much we must preserve.
Over time, schools have been asked to do more than transfer knowledge. We are places where children are raised. We embed them in relationships that both grow and sustain. We give young people the opportunity to belong to a place, to encounter difference, to practice citizenship. Long before the term “social-emotional learning” was coined, schools were quietly doing that work.
As AI reshapes labor, authorship, and even notions of originality, schools must continue to serve as seedbeds for discernment and citizenship. Our students will need to know not just how to use AI, but when to question it, when to resist it, and how to remain authors of their own thinking in a world of generated content.
If we retreat from this moment because we do not yet feel expert, we risk allowing the story of intelligence to be defined solely by efficiency, productivity, and scale.
We know intelligence is more than that.
We know learning is more than that.
We know children are certainly more than that.
The third thing we can do is act.
This is not a call for technical mastery before action. It is a call for leadership in ambiguity. We do not need to be the smartest people in the AI room. We need to be the most grounded in what it means to raise and educate human beings.
If we bring that grounding to this moment — if we ask better questions, design thoughtful containers, and tell honest stories about what we are learning — we will not cede the future of schools to technological glamour.
We will shape it.
And we can do so not with certainty, but with authority rooted in experience, humility, and care for children.
That feels like enough to begin.