The almost-forgotten salt mines pop up on the radar again, what is this sorcery? I don't recall butting heads with you; perhaps it was not a memorable experience?

Anyway, here goes. It seems to me that we are saturating the limits of the LLM architecture: new models are incremental improvements, nowhere near the exponential growth we saw a couple of years ago. AGI seems to be red-shifting, not blue.
Agentic workflows are great. When they get it right. Often they don't. And with the ceiling issue described above, it doesn't seem like everyone is going out of work anytime soon.
So while AI is still very much in business, the exponential-potential super future that is currently priced in ... I think that bubble is going to burst.
Still, even if the models hit a hard ceiling *right now*, there is so much untapped potential in the ones we already have that this isn't going away.
Personally I do agentic coding with Cursor, and it's great; I am having a daily talk with myself to not buy a GB10-based system to tinker with at home. I want this thing to continue into the sky ... I just don't see it happening right now.
They may break out a new architecture tomorrow, or a new learning paradigm, and we're off to the races again. Maybe.