Protecting the foundations of delivery in an AI-accelerated world.
I’ve spent many years inside Agile systems: long enough to see small teams do remarkable things when the fundamentals were protected, and long enough to see those same fundamentals slowly wither as organizations grew and pressure increased. In every case, the frameworks didn’t disappear. The ceremonies didn’t vanish completely. People still worked hard, and leaders still cared.
Agile didn’t collapse. It just got thinner in places, threadbare in others. Clear ownership blurred. The discipline that keeps teams aligned became negotiable. Speed became easier to measure than stewardship. Roles that once held the system steady were renamed, redistributed, or quietly deprioritized.
And then AI arrived.
What surprised me wasn’t that AI changed delivery. It was what it exposed.
AI didn’t rewrite the rules of teamwork. It magnified whatever foundation was already there. And occasionally, that magnification exposed places where many of us had become a little complacent.
As teams began experimenting with AI tools, I saw something else emerge.
There was genuine openness. It felt like a bit of a renaissance for teams. They were trying new assistants and review tools, generating tests with AI, drafting entire sections of work. Exploration felt natural. It often felt exciting. I got right in there with it myself: no coding experience or anything, just slugging away, asking LLMs to help me set up agents, vibe coding, just crazy fun.
But there were no shared guardrails.
Some team members used AI cautiously — reviewing suggestions, strengthening coverage, improving clarity. Others generated large portions of their work. Some introduced entirely new tools that no one else was using. That’s when I began to notice that the variation wasn’t small.
And variation, without shared structure, multiplies variables. If you’re trying an experiment, too many variables ruin it.
The technical differences showed up in integration friction and inconsistent patterns. But the more disquieting shift wasn’t technical.
It was strategic.
An assumption began to settle in: if everyone has AI tools, we should be able to deliver everything faster. I understand the appeal of that thinking. Speed feels like progress. But going faster only helps if you’re building the right thing.
In the absence of clear product direction — in the absence of shared understanding of what “good” looks like — acceleration becomes noise. Teams start building in parallel. Individuals optimize locally. Stand-ups drift toward isolated reporting instead of alignment.
AI can inadvertently encourage isolated “productivity.”
However, delivery remains collective.
That’s when I realized my question had shifted.
For years, I had been asking how to help organizations adopt Agile more effectively. How to scale it. How to reinforce it. How to keep it from thinning out under pressure.
At times, I thought the answer was more reinforcement. More emphasis. More explanation.
But AI changes the leverage point.
The question is no longer just whether teams understand the ceremonies or artifacts. It’s whether they are excellent at the fundamentals beneath them — shared ownership, disciplined collaboration, product clarity, technical stewardship, and the habit of building together instead of in parallel. Those were the places in which we had become complacent.
If those fundamentals are strong, AI becomes a multiplier of the best parts of how a team works. If they’re weak, AI magnifies the gaps.
I’ve started thinking of this work as Foundations Coaching for AI Delivery. Not as a new framework. Not as a new certification. And not as a replacement for Scrum Masters or Agile Coaches, but as an evolution of stewardship.
Because in an AI-accelerated environment, the fundamentals aren’t optional.
They’re structural.
If you’re a Scrum Master or an Agile coach reading this, I don’t think this is about abandoning your role. It’s about widening it.
AI is now part of the delivery system. Whether formally adopted or quietly used at the edges, it’s influencing how work is created, reviewed, and integrated. The fundamentals you’ve always protected — clarity, collaboration, discipline, shared accountability — matter even more when acceleration increases.

The question isn’t whether teams are using AI. It’s whether someone is tending to how they’re using it together.
If you’re a leader, this work can’t live in isolation. It requires room to observe. Room to guide. Room to align tooling and expectations. It requires consistency in direction. It requires resisting the temptation to equate acceleration with progress.
AI will continue to evolve. Teams will continue to experiment. That’s healthy. It’s the thing that keeps me coming back.
But experimentation without shared structure fragments a system. What we need now isn’t tighter control, but it also isn’t unchecked acceleration, no matter how seductive that may seem.
It’s stewardship.
Because AI is here. It’s not a pilot anymore. It’s embedded in the daily habits of delivery. When something that powerful becomes ubiquitous, the foundations beneath it matter more than ever.
AI doesn’t replace foundations. It magnifies them.
The question is whether we’re intentionally strengthening them — or simply hoping they hold.
I’ve outlined a minimal, open version of this idea on GitHub for those who want to explore or adapt it. I think my next addition will be a small two-to-three-team experiment section I’ve been noodling on.
Here’s the living outline (minimal, principle-driven): https://github.com/ahinek/foundations-coach-ai-delivery