Friday, 19 May 2023

Seeking Alpha: Nvidia Receives ChatGPT Windfall from Microsoft

How Generative AI is Disrupting the Cloud Industry

Since OpenAI launched ChatGPT in November of last year, it has become the fastest-growing consumer application, reaching over 100 million users by January of this year. The launch has also pushed cloud service providers to reassess the capabilities of AI. Microsoft, one of the leading providers of cloud services, has invested fully in generative AI and hosts all of OpenAI’s services. Meanwhile, Nvidia’s CEO, Jensen Huang, has declared that generative AI represents an “iPhone moment.”

While the long-term impact of generative AI remains to be seen, its immediate impact appears to be altering, perhaps irrevocably, the approach to cloud infrastructure. As the exclusive provider of hardware acceleration for Microsoft’s generative AI services, Nvidia should be the provider of choice for AI acceleration in the cloud. However, the first quarter of this year saw a downturn in spending on cloud hardware, most notably at Intel, whose Data Center and AI revenue was down 39% year-over-year. How do we reconcile continued cloud growth with the apparent downturn in cloud infrastructure spending?

Shifting the Focus from CPUs to a Mix of CPUs and Data Accelerators

One possible explanation for the downturn in cloud infrastructure spending is a more fundamental technological shift: the data center’s main computational engine is moving from traditional CPUs alone to a mix of CPUs and data accelerators. This is an approach Nvidia has championed for years, arguing that GPU acceleration is inherently more energy-efficient and cost-effective than CPUs alone. Meanwhile, the range of tasks that benefit from the GPU has kept growing. Energy-efficient supercomputing is now almost exclusively the domain of GPU acceleration. In commercial cloud services, GPUs accelerate everything from game streaming to the metaverse. And, of course, AI.
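
As a rough illustration of this division of labor (a sketch, not drawn from the article), the snippet below uses PyTorch to run the same matrix multiply on the host CPU and, when a GPU is present, on the accelerator. The point is the programming model: the CPU orchestrates while the accelerator does the heavy arithmetic. The matrix sizes are arbitrary.

```python
import torch

# The same computation, first on the CPU, then offloaded to a GPU accelerator.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# CPU path: the traditional "CPU as the main computational engine" model.
c_cpu = a @ b

# Accelerator path: move the data to the GPU and run the same operation there.
if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    c_gpu = a_gpu @ b_gpu          # executed on the accelerator
    torch.cuda.synchronize()       # wait for the asynchronous kernel to finish
    diff = (c_cpu - c_gpu.cpu()).abs().max()
    # Small numerical difference from precision and accumulation order; same math.
    print(f"max difference between CPU and GPU results: {diff:.2e}")
```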

With a substantial portfolio of GPUs, AMD is better positioned than Intel, which may account for AMD’s much better Q1 Data Center results. However, the advent of generative AI has upended the market for AI acceleration. Arguably, generative pre-trained transformers (GPTs) have rendered conventional GPU acceleration obsolete. When Nvidia unveiled the “Hopper” H100 data center accelerator in March 2022, it included a “Transformer Engine” to accelerate generative AI workloads. The Transformer Engine builds on Nvidia’s Tensor Cores to provide up to a 6X speedup in training transformer models.
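
To make the acceleration idea concrete, here is a minimal sketch of mixed-precision training on a single transformer encoder layer, using PyTorch’s autocast and gradient scaling. It illustrates the general technique of running transformer math in lower precision; it is not Nvidia’s actual Transformer Engine API, which on Hopper additionally manages FP8 precision per layer. The layer dimensions, dummy loss, and step count are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

# Mixed-precision (FP16) training sketch for one transformer encoder layer.
device = "cuda" if torch.cuda.is_available() else "cpu"
layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True).to(device)
optimizer = torch.optim.AdamW(layer.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(8, 128, 512, device=device)   # (batch, sequence, embedding)

for _ in range(3):                              # a few illustrative training steps
    optimizer.zero_grad()
    with torch.autocast(device_type=device, dtype=torch.float16,
                        enabled=(device == "cuda")):
        out = layer(x)
        loss = out.pow(2).mean()                # dummy loss, just to drive backprop
    scaler.scale(loss).backward()               # scaled backward pass in mixed precision
    scaler.step(optimizer)
    scaler.update()
```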

Importance of Microsoft’s Collaboration with OpenAI

Microsoft has engineered a major coup in AI through its collaboration with OpenAI. Microsoft’s Azure cloud service provides all the hosting for partner OpenAI, including ChatGPT and its more advanced generative AI models. Furthermore, it’s clear that Microsoft has had code-level access to OpenAI’s generative AI technology and has incorporated it into the various AI “copilots” the company now offers. But how did Microsoft develop such an exclusive relationship with a non-profit research institution? It turns out that OpenAI isn’t exactly a non-profit. In 2019, OpenAI created OpenAI LP, a “capped-profit” subsidiary controlled by the non-profit parent. Microsoft reportedly invested another $10 billion in OpenAI in January of this year, giving it access to some of the most popular and advanced AI systems. Microsoft has had to make a massive investment not only in OpenAI’s software but also in hardware, primarily Nvidia hardware, to get the jump on Google, whose researchers invented the transformer architecture that underpins generative AI.
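
As a rough sketch of what that hosting arrangement looks like from a developer’s side, the snippet below calls an OpenAI chat model through the Azure OpenAI service using the 0.x-era openai Python SDK that was current at the time of writing. The endpoint, API key, and deployment name are placeholders, not details from the article.

```python
import openai

# Point the SDK at an Azure OpenAI resource instead of api.openai.com.
openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"   # placeholder endpoint
openai.api_version = "2023-05-15"
openai.api_key = "<your-azure-openai-key>"                      # placeholder key

# "engine" is the name of the model deployment created in the Azure resource.
response = openai.ChatCompletion.create(
    engine="<your-gpt-deployment>",
    messages=[{"role": "user", "content": "Summarize why GPUs matter for generative AI."}],
)
print(response["choices"][0]["message"]["content"])
```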

Conclusion

The shift to generative AI is having a profound impact on the cloud industry and could drive a fundamental change in the data center’s computational engines. While Nvidia has championed GPU acceleration for years, the rise of generative AI has arguably made GPU acceleration the solution of choice for the AI workload. Microsoft’s collaboration with OpenAI and its massive investment in OpenAI LP have given it access to some of the most advanced AI systems, enabling it to take the lead in integrating generative AI into its products. With the rise of generative AI, we’re only beginning to glimpse the potential of AI for society and how it will transform our lives in the years to come.

Editor Notes:

The benefits of AI are only beginning to be realized, and the technology’s potential is vast. Companies such as Nvidia and Microsoft are driving advances in AI and cloud infrastructure and are setting the technological standards for the industry. The need for AI and cloud infrastructure is only going to grow as society becomes more dependent on technology. Companies would do well to invest in the development of AI and cloud infrastructure to remain competitive in the coming years. Stay informed on AI and cloud computing news at GPT News Room, your trusted source for the latest developments in the industry.




from GPT News Room https://ift.tt/A8uSmcV

