Agentic commerce

AI still does not understand what your eyes decide in half a second

The next shift in ecommerce is not only about prettier product pages. It is about translating what humans understand instantly into facts an agent can actually use.

Today I was explaining what I do to a friend who runs a small online store. We sat down in a cafe, skimmed the menu, ordered quickly. And I tried to use that moment, choosing from the menu, to explain how AI agents will buy things in her store.

It is hard for a person to understand why agents cannot just choose the way we do.

Here is an example. She only glanced at the page of options, and her body decided before her mind caught up. Too busy, not in the mood, do not want to read. The result: coffee. It works.

In a fraction of a second, a hundred micro-signals fired: context, hunger, time of day, weather, fatigue from the previous call. All of it collapsed into one decision in the time it takes to blink.

Humans call this intuition

This is what we call experience, taste, intuition, a trained eye. Years of looking at menus, eating, knowing yourself, reading a situation. By the time the brain explains the choice, the choice is already made.

The AI agent has none of that.

It only has what you wrote on the page.

The uncomfortable part for online stores

This is where it gets uncomfortable for anyone who runs an online store.

A product description that works beautifully for humans, such as “stylish leather sneakers, perfect for the city, in black and white”, tells an agent almost nothing.

Is the leather real or synthetic? Are they for walking all day, or going out to dinner? Do they run small? What is the sole made of? Will they survive a rainy day in Porto? Do they weigh 250 grams or 450?

A human does not ask any of this. They just see. The agent needs an answer to every question in plain text, in data. Otherwise it moves on to the next store that spelled it out better.
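What "spelling it out" looks like in practice is structured product data. A minimal sketch using schema.org-style JSON-LD (the property names come from schema.org's Product vocabulary; every value below is invented for illustration, not a real listing):

```python
import json

# A human reads "stylish leather sneakers, perfect for the city" at a glance.
# An agent needs each fact stated explicitly. All values here are invented
# examples of the kinds of answers an agent would look for.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "City leather sneakers",
    "material": "full-grain leather",  # real vs synthetic: now stated
    "color": ["black", "white"],
    "weight": {"@type": "QuantitativeValue", "value": 320, "unitCode": "GRM"},
    "additionalProperty": [
        {"@type": "PropertyValue", "name": "sole", "value": "rubber"},
        {"@type": "PropertyValue", "name": "waterResistant", "value": "yes"},
        {"@type": "PropertyValue", "name": "sizing",
         "value": "runs small; order half a size up"},
    ],
}

print(json.dumps(product, indent=2))
```

Every question from the previous paragraph now has a field an agent can read without guessing.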

This is not an “AI is not smart enough yet” problem

Even a perfect agent that sees every pixel of your photo and reads every word of the description still cannot decide for your customer.

Because your customer is not deciding based on the shoe. They are deciding based on their foot, their wardrobe, their plans, their mood, their girlfriend's opinion, their bank balance at 3pm on a Tuesday. Most of that the agent will not see.

The question every catalogue should answer

The question I keep coming back to is this:

If a customer used to decide with their eyes and their life, and now an agent decides for them using only the facts I wrote down, which facts did I forget to write down?

That is the real challenge. Sit with your catalogue and ask, for every product: what did the human customer understand at a glance, and what did I never put into words?

That is what the agent is missing. And the store where the agent finds the answer is the store where the sale happens.
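That audit can even be mechanical. A hypothetical sketch: list the facts an agent would ask for, then check each product record for the ones it never answers (the field names here are illustrative assumptions; every store's list will differ):

```python
# Hypothetical audit: which facts did the catalogue forget to write down?
# The list of agent-relevant facts below is an invented example.
AGENT_FACTS = ["material", "weight_grams", "sole", "water_resistant", "sizing_note"]

def missing_facts(product: dict) -> list[str]:
    """Return the agent-relevant facts this product record never spells out."""
    return [f for f in AGENT_FACTS if product.get(f) in (None, "")]

catalogue = [
    {"name": "City sneakers", "material": "leather", "weight_grams": 320},
    {"name": "Dinner loafers", "sizing_note": "true to size"},
]

for p in catalogue:
    print(p["name"], "->", missing_facts(p))
```

The output is exactly the list of questions an agent will ask your competitor instead.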

The second language of the web

I do not think most of us have fully felt how strange this shift is. We spent twenty years learning how to make websites that feel good to people. Now we are being asked to make the same websites understandable to something that has never felt anything.

It is a translation problem between two very different readers. We are only at the beginning of learning the second language.
