AI models work in a feedback loop: the fact that you're asking the question becomes part of the next response. They could cache responses, but the model would be worse off for it, since it loses that signal.
Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.
This is AI for search, not AI as a chatbot. And in the search context many requests are functionally similar and can have the same response. You can extract a theme to create contextual breadcrumbs that will be effectively the same as other people doing similar things. People looking for Thai food in Los Angeles will generally follow similar patterns and need similar responses, even if they arrive as several successive searches phrased with different word order and word choice.
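To make that concrete, here's a toy sketch in Python of caching keyed on an extracted theme rather than on the raw query text. Everything here is hypothetical; the stopword normalization is a crude stand-in for the real intent extraction (embeddings, nearest-neighbor lookup) a search engine would actually use. The point is just that differently-worded queries can collapse to one cache entry:

```python
from typing import Optional

class ThemeCache:
    """Caches responses keyed on a normalized 'theme' rather than raw query
    text, so differently-worded searches for the same thing share one entry."""

    def __init__(self) -> None:
        self._store: dict[tuple[str, ...], str] = {}

    def _extract_theme(self, query: str) -> tuple[str, ...]:
        # Hypothetical normalization: lowercase, drop filler words, sort
        # the remaining tokens. A real system would use an intent model or
        # embedding similarity instead of this toy token bag.
        stopwords = {"in", "the", "a", "an", "near", "best", "find", "me", "for"}
        tokens = [t for t in query.lower().split() if t not in stopwords]
        return tuple(sorted(tokens))

    def get(self, query: str) -> Optional[str]:
        return self._store.get(self._extract_theme(query))

    def put(self, query: str, response: str) -> None:
        self._store[self._extract_theme(query)] = response

cache = ThemeCache()
cache.put("thai food in los angeles", "<cached answer>")
# Different wording, same theme, so it hits the cached entry:
assert cache.get("los angeles thai food") == "<cached answer>"
```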
And none of this is updating the model (at least not in a real-time sense that would require re-running a cached search); it's all short-term context fed in as additional inputs.
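In code terms, that short-term context is just extra text prepended to the prompt; the model's weights never change. A toy sketch, where `call_model` is a hypothetical stand-in for whatever inference API is actually in use:

```python
def build_prompt(history: list[str], query: str) -> str:
    """Prepend recent searches as short-term context. No model update
    happens here: the session history only shapes this one prompt."""
    context = "\n".join(f"- {q}" for q in history[-5:])  # last few searches only
    return (
        "Recent searches by this user:\n"
        f"{context}\n\n"
        f"Current search: {query}\n"
        "Answer the current search, using the recent searches as context."
    )

def call_model(prompt: str) -> str:
    # Hypothetical stub for the actual inference call.
    raise NotImplementedError

history = ["thai food in los angeles", "thai restaurants open late"]
prompt = build_prompt(history, "which ones deliver?")
# response = call_model(prompt)
```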