Who owns the commercial outcome of your AI
The most common structural failure in mid-market AI is not the wrong tool or vendor. It is that nobody owns the commercial outcome. And activity without ownership is just cost.
Why can nobody answer what AI is producing?
I was in a board meeting last year when the chair asked a simple question.
"What is AI actually producing for us?"
The room went quiet. Not because people did not know AI was happening. Everyone knew. IT had deployed two tools. Marketing was generating content with a third. Sales had started using an AI prospecting assistant. The innovation team had a pilot running somewhere.
The problem was not a lack of AI activity. The problem was that nobody in that room could connect any of it to a commercial number.
The CTO talked about adoption rates. The CMO mentioned efficiency gains. The head of sales said it was "early days." The CEO looked at me and said what I have heard dozens of times: "We are spending money on AI and I cannot tell the board what it is producing."
That is not a technology problem.
That is an ownership problem.
It is the most common structural failure I see in mid-market organisations right now. Not the wrong AI. Not the wrong vendor. Not the wrong use case. The wrong answer — or more often, no answer at all — to the question: who owns the commercial outcome?
I have spent 26 years building, advising, and occasionally dismantling technology strategies. The pattern is consistent. When nobody owns the commercial outcome of a technology initiative, the initiative drifts. It consumes budget. It occupies people. And it produces nothing the board can point to.
AI is no different. It is just faster and more expensive.
What does the accountability vacuum look like?
Here is what I see in most mid-market organisations that are "doing AI."
IT runs the infrastructure. They own deployment, security, integration, uptime. They are measured on system reliability, not revenue. When you ask IT what AI is producing commercially, they will tell you the platform is live and adoption is growing. That is a technology answer, not a commercial one.
Marketing has deployed AI tools independently. Content generation. Campaign optimisation. Audience segmentation. The marketing team can tell you they are producing more content faster. They cannot tell you whether that content moved pipeline. They are measured on marketing metrics — traffic, engagement, MQLs — not on whether the AI they deployed contributed to a closed deal.
Sales is experimenting. An AI prospecting tool here, an email assistant there. Individual reps are using tools the organisation has not sanctioned, let alone measured. Nobody is tracking whether these experiments are producing commercial outcomes or just producing more activity.
And somewhere, an innovation team is running a pilot. It has a steering committee. It has a quarterly review. It has a slide deck. It does not have a P&L owner.
This is the accountability vacuum. Every function is deploying AI. Every function can justify its own activity. But nobody — not a single person — is accountable for the commercial outcome of AI as a strategic capability.
The CEO cannot get a straight answer because there is nobody whose job it is to give one.
I want to be clear about something. This is not a failure of people. I have seen talented, committed leaders in every one of these functions doing their best with the mandate they have been given. The failure is structural. The organisation has adopted AI without deciding who owns what it produces.
Why does shared accountability not work?
When leadership teams recognise the vacuum, the first instinct is almost always the same.
"Let's make it a shared responsibility."
A cross-functional AI council. A steering committee with representatives from IT, marketing, sales, and operations. Monthly meetings. Shared objectives. Collective ownership.
I understand the appeal. It feels collaborative. It feels inclusive. It feels like the way modern organisations should govern a cross-cutting capability.
It does not work.
Shared accountability without decision rights is not ownership. It is diffusion.
Here is what actually happens. The committee meets. Each function presents its AI activity. There is discussion. There are actions. Nobody has the authority to kill a failing initiative because nobody owns the outcome. Nobody can redirect budget because the budget sits in functional silos. Nobody reports to the board with a single commercial number because nobody has the mandate to produce one.
The committee becomes a coordination layer without decision power. It produces alignment in theory and drift in practice. Six months in, the CEO is still asking the same question. And the committee produces a deck with activity metrics from four different teams.
I have seen this pattern in at least a dozen organisations. The committee is not the solution. The committee is a symptom of the same problem — nobody has been given the authority and accountability to own the commercial outcome.
The ownership question needs to be worked through as part of deciding how to build, not just what to build. And deciding who owns the outcome is the build decision that most organisations skip entirely.
What does commercial ownership actually mean?
So if shared accountability does not work, what does ownership actually mean in this context?
It does not mean technical control. The owner of AI outcomes does not need to understand model architecture or deployment pipelines. That is IT's domain and should remain so.
Ownership means commercial accountability.
One person who answers three questions. What did this produce? Was it worth the investment? Should we scale it or stop it?
That person must have decision rights. Not influence. Not a seat on a committee. Actual authority. Budget authority to fund or defund initiatives. Initiative authority to start, scale, or kill. Reporting authority to stand in front of the board and give a straight answer.
Ownership is not only about deciding what to keep and what to kill. It is about deciding where to allocate the capital and resources you have. That allocation logic requires a single point of accountability. Without it, capital gets distributed by consensus, which means it gets distributed by politics.
If the person who owns your P&L does not care whether a specific AI initiative succeeds, nobody owns that initiative. It is running on organisational momentum, not strategic intent.
And momentum without direction is just expensive drift.
I am not prescribing a title. I am not drawing an org chart. I am stating a principle: commercial AI outcomes require single-point accountability with decision rights. The specific structure will depend on your organisation, your market position, your leadership team. But the principle does not vary.
This is not about hierarchy. It is about clarity.
What does ownership clarity enable?
When one person owns the commercial outcome of AI, several things change immediately.
Initiatives get funded or killed based on commercial signal, not political consensus. The AI council can still exist as a coordination layer, but decisions flow through someone with the authority to act on them. Experiments that produce no commercial signal get stopped before they consume another quarter of budget.
The board gets an answer. Not four answers from four functions. One answer from one person who is accountable for it. "AI produced this commercial outcome last quarter. Here is what we are doing next. Here is what we stopped." That kind of clarity changes the board's relationship with AI from scepticism to confidence.
The second and third AI bets happen faster. When the first initiative produces a visible commercial result and someone is accountable for that result, executive confidence compounds. The next proposal does not need six months of committee review. It needs a business case reviewed by the person who owns the outcome.
Budget stops leaking into undirected experiments. Not because any single experiment costs a fortune, but because the aggregate spend across IT, marketing, sales, and innovation adds up to real money with no aggregate return. Single-point ownership makes that spend visible. And visibility is the precondition for discipline.
Perhaps most importantly, the organisation starts treating AI as a commercial capability rather than a technology experiment. That shift in thinking — from "we are doing AI" to "AI is producing commercial results" — is the difference between activity and advantage.
The ownership question is not question seven in a long list. It is question one. Before you choose a vendor. Before you approve a pilot. Before you allocate budget. If you cannot answer it, everything that follows is built on a structural absence.
What question should you take into your next meeting?
Here is what I want you to do with this.
The next time your leadership team discusses AI — the next initiative, the next budget allocation, the next vendor conversation — ask one question before anything else proceeds.
"Who is accountable for the commercial outcome of this?"
Not who is managing the project. Not who is providing the technology. Not who sits on the steering committee. Who is accountable for what it produces commercially, with the authority to act on that accountability?
If the answer is "it's shared," challenge that claim. Shared accountability without decision rights is the reason you cannot answer the board's question today.
If the answer is "IT," challenge that framing. IT owns the infrastructure. That is not the same as owning the outcome.
If nobody can name the person, you have found the problem. And it is a bigger problem than which AI to deploy or which vendor to choose.
If the ownership question leads you to ask whether your organisation needs a dedicated role for this, that is a different conversation worth having. If it reveals a gap between your technology teams and your commercial teams, that deserves its own examination.
But start here. Start with the ownership question.
Because until someone owns the commercial outcome of AI in your organisation, you do not have a strategy.
You have activity.
And activity without ownership is just cost.