AI Will Make Your Marketing Team Faster. And Worse.

Three converging forces are opening a gap most CMOs haven't named yet.


I had dinner recently with a room full of senior marketers. Smart people. Big brands. The conversation was sharp, fast, and almost entirely about tactics.

Which platforms. What formats. How to win the algorithm. Who to hire. How to outsource faster.

Nobody asked the question underneath: what happens when the platform changes again? What have you actually built?

It wasn't a bad conversation. These are real problems, and the people in that room were solving them well. But there was a gap between the quality of the diagnosis and the depth of the response. I keep seeing it in how large companies respond to structural shifts. They see the change clearly. They respond to it intelligently. And the response never quite reaches the level the problem demands.

That gap matters. Because right now, three forces are converging that change the shape of the problem entirely.

Three shifts, one pattern

Social collapsed the funnel first. Someone sees a product in a TikTok comment, checks the brand on Instagram, reads three reviews on Reddit, and buys through a link in someone else's story, all before the brand team knows it happened. Discovery, research, service, and purchase happen in a single interaction. The model where brand lives with a creative agency, performance with a media agency, and social with the digital team no longer maps to how people actually experience you.

AI is collapsing it further. Conversational AI sits in the middle of this blurred funnel. When someone asks ChatGPT or Gemini for a recommendation, they skip the awareness stage entirely. They don't visit your website. They don't see your ads. They get an answer, and if your brand isn't part of that answer, you don't exist.

After ChatGPT launched, Stack Overflow traffic plunged. AI could replace pure information-seeking. Reddit held steady. Community and lived experience can't be replicated. The same logic applies to brands. Anything purely informational is vulnerable. Anything built on judgement, relationship, or genuine experience is defensible.

And the horizon that makes both of these shifts urgent rather than gradual: the customer may not be human. AI agents are beginning to make purchasing decisions, not just recommendations but actual transactions. When the customer is a machine, the rules change entirely. Your content, your product information, your brand presence all now serve two audiences. One human, one machine. Built for different logics. Evaluated by different criteria.

"Bot psychology" is already a field. This isn't guesswork anymore.

The tactical trap

The diagnosis behind these shifts is excellent. Senior marketers understand that social has changed the game. They can feel that AI is shifting how people find brands. The smarter ones are already thinking about AI agents. Stefano Puntoni (HBR) published a sharp analysis recently of how conversational AI and AI agents are reshaping marketing, then closed with the most important line in the piece: "Treat this as a leadership issue, not a marketing-department problem." The diagnosis was right. Every recommendation that preceded it was tactical.

A separate HBR study of 35 senior leaders found that 93% identified "human factors" as the primary barrier to AI adoption. That framing tells you where the thinking stops. "Human factors" is what people say when they can see the problem is about people but can't name the structural gap underneath. The issue isn't the humans. It's that nobody has built the infrastructure for humans to operate in.

And that's the pattern. In every case, the response is the same.

  • Social collapsed the funnel? Hire a social agency. Build a content studio. Outsource to someone who knows TikTok.
  • AI is displacing search? Build Generative Engine Optimisation (GEO) capability. Hire an AI consultant. Optimise your structured data.
  • AI agents are becoming customers? Study bot psychology. Restructure your metadata. Prepare for machine-readable commerce.

None of these responses is wrong. You probably do need a data scientist. You probably do need someone who understands GEO. The problem is what happens after the hire. In most companies, the specialist delivers. They run the analyses, build the models, handle the work that nobody else can do. The capability stays in their head.

💡
When the step change actually happens, it's because the specialist's job isn't just to do the work — it's to embed their judgement broadly enough that the wider team's decisions get better. That almost never gets built into the brief.

There's a useful way to see why these responses keep falling short. Sequoia Capital published an analysis recently that splits professional work into two types: intelligence and judgement.

  • Intelligence work is rule-based. Complex, skilled, but rules are rules. Campaign setup. Performance reporting. Audience segmentation. Media buying.
  • Judgement work is what sits above the rules: deciding what to test, reading what the data means, knowing when to hold a brand position and when to shift.

Every one of those tactical responses is an intelligence-layer fix. Hire someone who knows the rules of TikTok. Buy a tool that handles the rules of GEO. Outsource the rules of machine-readable commerce.

The problem is that AI is coming for the intelligence layer faster than most marketing leaders have grasped. Software engineering crossed the threshold first. Over half of all AI tool usage across professions sits in that one field. Every other field is catching up. Within the next year or two — faster in some categories, slower in others — most of the intelligence work a marketing team does today will be handled by tools that are faster, cheaper, and more consistent than any team. Campaign optimisation, content production, performance analytics, media buying within set parameters. The tools already exist. Adoption is the only variable.

So here is the question nobody at that dinner was asking: when the tools handle the intelligence work, what happens to the judgement?

The gap that opens

Most marketing teams have never had to answer that question, because the intelligence work and the judgement work were tangled together. When your team spends weeks building a campaign, they learn things along the way. What the audience responds to. How the creative lands. What the data suggests about the next move. The doing teaches the thinking. Slowly, unevenly, but it happens.

When AI handles the doing, that accidental learning path disappears. The team gets the output without the process. They get the answer without the working. And working is where judgement develops.

Ask a CMO what their team will do with the time AI frees up and the answer is always the same: strategic thinking, creative experimentation, brand building. The reality, in most companies I've watched adopt new technology, is different. People freeze. Not because they lack ability, but because their professional identity was built on being excellent at things machines now handle. Or they fill the time with more of the same work at lower resolution. Or they defer to whoever is loudest in the room, because nobody has built the infrastructure for making strategic calls at scale.

This is the part that should worry CMOs more than any platform shift.

💡
AI will make marketing teams faster. It will also, unless something deliberate is built, make them less capable of the decisions that matter most.

Not because the people are worse, but because the thing that taught them to think has been automated out of their day.

I watched it happen when marketing automation arrived. A team I worked with got a system that could test and optimise at a speed they'd never had before. Within a year, they were running more tests than ever and learning less from each one, because nobody had built the thinking infrastructure around what to test or what the results meant. The tools got better. The decisions didn't.

The cost of that gap doesn't show up in any quarterly report. It shows up as teams that are efficient but directionless. Marketing functions that produce more content than ever and build less brand equity than ever. Decisions that get made by default rather than by design, because the people closest to the work have the tools to execute but not the infrastructure to decide.

And here's the competitive version of the problem: the company that builds judgement infrastructure will compound what it learns from every AI tool it deploys. The company that doesn't will buy the same tools, run the same campaigns, and wonder why nothing accumulates. Same spend. Widening gap.

What this actually demands

Small companies and social-native brands don't have this problem. When you're twenty people, everyone already operates across the full funnel. There's no gap between brand and performance because the same person does both. There's no handoff between social and customer service because they sit at the next desk. And there's no shortage of judgement because everyone is close enough to the work to develop it naturally.

But as companies grow, they specialise. They fragment into departments. They build depth (which is valuable) but lose the horizontal integration that made them agile when they were small. By the time you're a thousand people, brand, performance, service, and commerce live in different teams, different agencies, different reporting lines. The ability to hold it all together doesn't exist in any one place.

The convergence exposes this fragmentation. It demands the agility of a startup at the scale of a multinational. No amount of tactical hiring or outsourcing solves that, because the gap isn't in any single function. It's between all of them.

You can't workshop your way to distributed judgement. It has to be built as infrastructure: principles clear enough that people across the company can make good calls without waiting for the expert to weigh in. I wrote last month about why most approaches to this don't scale – and what the ones that do have in common.

That's what "treat this as a leadership issue" actually means. Not that the CMO should pay more attention to AI, or that the board should add a digital agenda item. It means building the judgement infrastructure that lets a thousand people make good decisions in situations nobody designed for, and keep making them as the ground shifts again.

The uncomfortable question

There's a test I keep coming back to: does what you've built keep working when the builders leave? Here, it applies with a twist. The question isn't just whether the capability survives the departure of the people who built it. It's whether it survives the departure of the platform it was built for.

Most tactical investments fail that test. What was built for social doesn't hold for AI. What was built for AI won't hold for whatever comes after. If the capability can't flex, it was never infrastructure. It was a workaround with a shelf life.

These aren't three separate challenges. They're three symptoms of one structural shift: the marketing operating model that most large companies have run for twenty years no longer maps to how value is created. The temptation is to respond to each one individually: a social strategy, an AI strategy, a GEO strategy, each with its own team, agency, or consultant.

That's rebuilding. Starting from scratch each time the ground shifts, because nothing from the last round was designed to flex. The alternative is calibration: infrastructure that adjusts rather than breaks when the context changes. The point isn't whether you'll keep transforming. It's whether you're building something that calibrates or something you'll have to replace in three years.

💡
The real question is whether you can build capability that doesn't depend on any specific platform, any specific partner, or any specific person staying in the room.

If you're on a board or running a marketing function, there's a version of this you can test on Monday morning: ask your team what they'd do differently if every piece of intelligence work they do today were automated by September. If the answer is "more strategic work" but nobody can name what that means in practice, you've found the gap.

And if you get different answers from your board, your executive team, and your marketers, you've found the gap before the gap.

The tools are about to get very good at the intelligence work. The question is what your team is becoming.