New Delhi: US chip designer and computing firm Nvidia on Wednesday announced a collaboration with Microsoft to build computers "at scale" to handle intensive artificial intelligence computing work in the cloud. The AI computer will run on Microsoft's Azure cloud and use tens of thousands of GPUs, including Nvidia's most powerful H100 and A100 chips. Nvidia declined to comment on the value of the deal, but industry sources said each A100 chip is expected to cost between $10,000 (roughly Rs. 8,15,400) and $12,000 (roughly Rs. 9,78,500), with the H100 costing much more.

"We are at the tipping point where AI is coming into the enterprise and getting into the services that customers can use to deploy AI for business use cases," said Ian Buck, general manager of Hyperscale and HPC at Nvidia. "We're seeing a broad-based adoption of AI ... and the need to apply AI to enterprise use cases."

In addition to selling chips to Microsoft, Nvidia said it would collaborate with the software and cloud giant to develop AI models. Buck also said that Nvidia will be a customer of Microsoft's AI cloud computer, developing AI applications on it to provide services to customers.

The rapid development of AI models, such as those used for natural language processing, has greatly increased the demand for faster, more powerful computing infrastructure. Nvidia announced that Azure will be the first public cloud to use its Quantum-2 InfiniBand networking technology, which offers speeds of up to 400 gigabits per second. This networking technology connects servers at very high speed, which is important because heavy AI computing requires thousands of chips to cooperate across multiple servers.