"Beyond the Musk vs. OpenAI drama, the real signal is the fragility of AI giants. Between ads entering ChatGPT and agents writing their own code, it's time to rethink our architectures."
I look at the $134 billion lawsuit between Musk and OpenAI and see past the tech-tabloid drama filling the feeds. The figure is shocking, sure, but the real signal is the structural fragility of how these AI giants were built. For those of us who spend our days designing architectures, the lesson is clear: hybrid governance is an operational nightmare that explodes under the pressure of scale.
I don't care about the gossip; I care about the stability of the providers my agents are built on. If Microsoft were forced to pay, it could slow infrastructure investment or change the terms of service of the APIs I use every day. It's a brutal reminder: building everything on a single vendor is a risk we can no longer afford. We need to diversify, and perhaps the answer lies precisely in the real AI integration I have been preaching for some time, where we don't depend on a single oracle.
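In practice, "not depending on a single oracle" starts with a thin abstraction layer between your agents and any vendor SDK. Here is a minimal sketch of that idea; the provider names, the `complete` signature, and the stub backends are all illustrative assumptions, not any real API:

```python
# Sketch: provider diversification via a fallback client, so no workflow
# is hard-wired to a single vendor's API or terms of service.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Provider:
    name: str
    complete: Callable[[str], str]  # prompt -> completion text

class ResilientLLMClient:
    """Try providers in order; fall back to the next on any failure."""
    def __init__(self, providers: list[Provider]):
        self.providers = providers

    def complete(self, prompt: str) -> str:
        errors = []
        for provider in self.providers:
            try:
                return provider.complete(prompt)
            except Exception as exc:  # outage, quota cut, ToS change...
                errors.append(f"{provider.name}: {exc}")
        raise RuntimeError("all providers failed: " + "; ".join(errors))

# Stub backends standing in for real vendor SDK calls:
def vendor_a(prompt: str) -> str:
    raise TimeoutError("vendor A is down")

def vendor_b(prompt: str) -> str:
    return f"echo: {prompt}"

client = ResilientLLMClient([Provider("vendor_a", vendor_a),
                             Provider("vendor_b", vendor_b)])
print(client.complete("hello"))  # served by the backup provider
```

The point of the design is that the agent code only ever sees `ResilientLLMClient`, so swapping or adding vendors is a one-line change in the provider list, not a refactor.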
It was inevitable: inference costs a fortune. The introduction of ads in ChatGPT in the US marks the end of the "free ride" era. From a technical standpoint, I appreciate the choice to keep ad serving separate from the LLM's token flow: polluting the generative output would have been disastrous for the tool's trustworthiness.
The real impact will be felt in corporate workflows. The launch of ChatGPT Go at $8 suggests a clear segmentation: pay little or nothing and you are the product; pay the premium and you get clean computing power. For us architects, this means recalculating the costs of our systems. It's no longer enough to look at the price per token; we must also consider the cost of "cleaning" the output data.
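That recalculation fits in a few lines. The sketch below models effective cost as raw token price plus a cleaning pass, amortized over the retries that rejected outputs force; all the prices and rates are hypothetical placeholders, not real vendor pricing:

```python
# Back-of-the-envelope sketch: effective cost per 1K tokens once you add
# a "cleaning" pass (validation/filtering) and retries for rejected output.

def effective_cost_per_1k(raw_price: float,
                          cleaning_price: float,
                          reject_rate: float) -> float:
    """Raw generation + cleaning, amortized over geometric retries."""
    per_attempt = raw_price + cleaning_price
    expected_attempts = 1.0 / (1.0 - reject_rate)
    return per_attempt * expected_attempts

# A "cheap" tier with a 20% reject rate can end up costlier than a
# premium tier that rarely needs a retry (illustrative numbers):
cheap = effective_cost_per_1k(raw_price=0.5, cleaning_price=0.2, reject_rate=0.20)
premium = effective_cost_per_1k(raw_price=0.8, cleaning_price=0.05, reject_rate=0.02)
print(round(cheap, 3), round(premium, 3))  # 0.875 0.867
```

Plug in your own observed reject rates and the "pay little, you are the product" tier often stops looking cheap.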
The news that made me jump out of my chair, however, is another: Claude Code writing Claude Cowork. Seeing Anthropic use its own AI to build an internal product in 10 days is the ultimate validation of the method I apply every day. The bottleneck is no longer syntax; it is the clarity of the specifications.
I am rethinking my workflows these very days: from now on, I will treat AI as a tireless junior dev to whom I can delegate entire modules. This connects perfectly with the revolution of multi-agent systems and DSLMs (domain-specific language models). Small vertical models talking to each other are far more efficient than a single generalist LLM. This is where I see the real value: moving from "playing with prompts" to engineering robust systems. If you want to dig deeper into how these flows change the game, I analyzed the topic in my piece on agentic AI and workflows.
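The "small vertical models talking to each other" pattern boils down to a dispatcher in front of specialists. This is a toy sketch of that routing; the domain names and the keyword classifier are invented stand-ins (in production the router would itself be a cheap classifier model):

```python
# Sketch of a multi-agent dispatcher: route each task to a small
# domain-specific model instead of one generalist LLM.
from typing import Callable

SPECIALISTS: dict[str, Callable[[str], str]] = {
    "sql": lambda task: f"[sql-model] {task}",
    "legal": lambda task: f"[legal-model] {task}",
    "code": lambda task: f"[code-model] {task}",
}

def classify(task: str) -> str:
    """Toy keyword router; a real system would use a small classifier."""
    lowered = task.lower()
    for domain in SPECIALISTS:
        if domain in lowered:
            return domain
    return "code"  # default specialist

def dispatch(task: str) -> str:
    return SPECIALISTS[classify(task)](task)

print(dispatch("write a sql query for monthly churn"))
```

Each specialist stays small, cheap, and testable in isolation, which is exactly what makes the system engineerable rather than a single opaque oracle.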
Finally, there is a strong signal from Europe and the open-source world. The EU is putting 307 million on the table, with a slice dedicated to the Open Internet Stack. I usually ignore bureaucracy, but here there is technical substance: we are talking about solid alternatives to the American walled gardens.
In parallel, the open-source release of Spirit AI for robotics and LTX-2 for video is a godsend. Having the weights available locally lets me integrate vision and action into custom workflows, bypassing cloud latency. I can already picture logistics applications where the agent "sees" and corrects errors in real time, bringing automation directly to the edge. It is exactly the pragmatic revolution I was waiting for to get out of the chat window and act on the physical world.
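The see-and-correct loop I'm imagining is structurally simple. Below is a minimal sketch of it; `detect_misplaced` stands in for a local vision model running on-device, and the warehouse items and slots are entirely invented for illustration:

```python
# Sketch of an edge agent loop: observe physical state locally,
# diff it against the plan, and issue corrective actions.

def detect_misplaced(observed: dict[str, str],
                     plan: dict[str, str]) -> list[str]:
    """Stand-in for local vision: items whose slot differs from the plan."""
    return [item for item, slot in observed.items() if plan.get(item) != slot]

def correct(observed: dict[str, str], plan: dict[str, str]) -> dict[str, str]:
    """Act until observation matches the plan (no cloud round-trip)."""
    for item in detect_misplaced(observed, plan):
        observed[item] = plan[item]  # here: where a move command would go
    return observed

plan = {"box_a": "slot_1", "box_b": "slot_2"}
observed = {"box_a": "slot_3", "box_b": "slot_2"}
print(correct(observed, plan))  # {'box_a': 'slot_1', 'box_b': 'slot_2'}
```

With the model weights local, this entire observe-diff-act cycle runs at the edge, which is what makes real-time correction plausible at all.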
This week taught us that AI is no longer just a cute chatbot: it is heavy industry, made of lawsuits, infrastructure costs, and complex architectures. For those who want to build seriously, it is time to get their hands dirty with code and strategy. If you are looking for the right tools to start building your infrastructure, take a look at my AI tools list.

AI Solutions Architect
I don’t just write about AI; I use it to build real value. As an AI Solutions Architect, I design digital ecosystems and autonomous workflows. My mission? To help companies transform slow, manual processes into intelligent, scalable, and high-performance code architectures.