Last month, OpenAI announced a strategic partnership with Nvidia, which enables the AI powerhouse to deploy 10 gigawatts of datacenters based on Nvidia’s formidable GPU hardware.
This month, OpenAI unveiled its strategic alliance with AMD, in which the former will deploy 6 gigawatts of AMD Instinct GPUs. The agreement also includes AMD stock options for OpenAI, underscoring the close relationship between the two organizations.
These developments have not gone unnoticed by industry insiders, and they may presage a monopoly on the valuable compute resources necessary to innovate, improve, and operationalize the AI models that are rapidly becoming ubiquitous in today’s society.
According to Assaf Melochna, president and co-founder of Aquant, “If one company, or a few, controls the bulk of global compute, they effectively control who gets to innovate and at what cost. That’s a compute monopoly and it could define the next decade of AI power dynamics.”
With other major compute manufacturers, such as Intel (which, Melochna pointed out, Nvidia recently invested in) still operating outside the direct auspices of OpenAI, it may be a tad early to predict a monopoly just yet.
What’s clear, however, is what these recent partnerships reveal about the competitive strategy of OpenAI, as well as those of other notable entities in the AI race, including Meta and the Chinese national government.
Compute scarcity
The sheer size of the large language models and frontier models at the fore of cutting-edge AI applications naturally taxes the compute resources required to create and run such models in production. Much of the analysis of OpenAI’s recent partnerships with Nvidia and AMD, both of which specialize in compute resources, is predicated on the fact that “there’s a scarcity of computing power,” Melochna said. Consequently, OpenAI’s interest in two of the largest compute manufacturers is far from academic, and it is possibly indicative of a strategy to distance itself from other AI companies producing such models.
According to Melochna, the company is deliberately “buying the future computing power and that will allow them to have control over all the new AI in the U.S.” Although both Nvidia and AMD can technically still service other customers, some pundits are concerned that there simply won’t be enough compute resources to go around if these manufacturers prioritize OpenAI. “The competition is not about who has the best model,” Melochna explained. “If I’m not enabling other people to build other models to compete with me, I’m going to control the data.”
The open source route
While OpenAI appears to be focused on acquiring as much of today’s compute resources as it can, other players in the AI space are pursuing alternative strategies. China—which made significant headlines in this space when DeepSeek released some of its models as open source—appears to be taking the opposite approach to OpenAI’s. According to Melochna, the Chinese government is subsidizing efforts to construct smaller AI models that are open source, yet potentially as influential as some of OpenAI’s larger ones.
“This has created a lot of development over the last few months about how computing can be done in China with a smaller amount of computing power,” Melochna commented. “One can say this is how you destroy a monopoly; you give other people the ability to develop alternatives to it.”
Accessing models via the global open source community is one of the surest ways to democratize the development and use of AI, particularly when those models don’t require the vast compute resources OpenAI has at its disposal. “The best thing is that if that computing power will come and be distributed to everyone and we will be able to learn and to build open source for AI in the US,” Melochna observed.
No easy answers
There are strategies in the AI race beyond acquiring a considerable share of contemporary compute resources or building smaller open source models. Meta has been hiring prominent researchers in this space and offering them exorbitant bonuses. Many perceive this aggressive talent acquisition as foundational to its bid to surge ahead.
Ultimately, time will tell how successful any of these endeavors will be. What’s less equivocal are the immediate effects of these organizations dedicating such energy and effort to their strategic initiatives. “It’s become a competition of giants,” Melochna said. Such a degree of competitiveness is a testament to the overall influence of AI models in contemporary society.