AI is a Whole New Industry

Analysis of Nvidia Q4 FY24 Earnings Call

Welcome to the 551 new members this week! nocode.ai now has 39,236 subscribers

NVIDIA's investor Q&A unveiled its vision: AI-powered data centers, booming generative AI, and next-generation products. The company is tackling supply chain issues and ensuring fair product allocation as it shapes the future of computing.

Today, I'll cover in detail what Jensen Huang, Founder and CEO of NVIDIA, shared during the Wednesday, February 21st, earnings call.

I’ll cover:

  • The Future of Data Centers

  • Generative AI as a Whole New Industry

  • Growth in Inference Computing

  • The New Generation of Nvidia Products

  • Supply Chain Constraints

  • Fair Allocation of Nvidia Products Across Customers, Countries, and Industries

  • Beyond Hardware: NVIDIA's Growing Software Business

Let’s Dive In! 🤿

Financial Performance and Growth

NVIDIA announced a record-breaking Q4 financial performance for the period ending January 28, 2024, with quarterly revenue soaring to $22.1 billion, a 22% increase from the previous quarter and a 265% surge from the previous year. The Data Center segment was a standout, contributing $18.4 billion, up 27% from Q3 and up 409% year-over-year. The fiscal year 2024 saw revenues hit $60.9 billion, marking a 126% growth. Earnings per diluted share also saw significant gains, with GAAP earnings at $4.93, up 765% from the previous year, and Non-GAAP earnings at $5.16, up 486% from last year.
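As a quick sanity check on those growth rates, the back-of-envelope arithmetic below works backwards to the implied prior-period figures. This is my own rough calculation, not a reproduction of NVIDIA's reported numbers, so treat the outputs as approximate.

```python
# Back-of-envelope check of the reported growth rates (rounded figures,
# my own approximation rather than NVIDIA's reported prior-period numbers).
q4_revenue_b = 22.1          # Q4 FY24 total revenue, $B
dc_revenue_b = 18.4          # Q4 FY24 Data Center revenue, $B

# Implied prior periods from the stated growth percentages
prev_quarter = q4_revenue_b / 1.22    # +22% quarter-over-quarter
prev_year_q4 = q4_revenue_b / 3.65    # +265% year-over-year
dc_prev_year = dc_revenue_b / 5.09    # +409% year-over-year

print(f"Implied Q3 FY24 revenue:     ~${prev_quarter:.1f}B")   # ~$18.1B
print(f"Implied Q4 FY23 revenue:     ~${prev_year_q4:.1f}B")   # ~$6.1B
print(f"Implied Q4 FY23 Data Center: ~${dc_prev_year:.1f}B")   # ~$3.6B
```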

Q4 Fiscal 2024 Summary

Revenue growth of NVIDIA since 2020 (source: nvidia.com)

Nvidia’s stock price jumped 16% on Thursday, increasing the company’s market value by a staggering $273 billion in just one day, a record single-day gain that pushed its market capitalization toward the $2 trillion mark.

Nvidia Stock Performance in the last year

The Future of Data Centers

NVIDIA is at the forefront of transitioning data centers from general-purpose computing to accelerated computing. This shift is driven by the need for increased energy efficiency and processing speed, particularly as general-purpose computing starts to hit its limits. NVIDIA's accelerated computing technology promises a dramatic improvement in data processing costs and efficiency, paving the way for the next wave of computing.

What is Accelerated Computing (source: nvidia.com)

Accelerated computing is needed to tackle the most impactful opportunities of our time, such as AI, climate simulation, drug discovery, ray tracing, and robotics. NVIDIA is dedicated to accelerated computing, working from the top down by refactoring applications and creating new algorithms, and from the bottom up by inventing new specialized processors, such as the RT Core and Tensor Core.
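To make the idea concrete, here is a minimal sketch of what accelerated computing looks like from a developer's point of view: the same matrix multiplication run on a CPU and then offloaded to an NVIDIA GPU. This is my own illustration, it assumes PyTorch and a CUDA-capable GPU (neither is mentioned in the call), and the exact speedup depends entirely on the hardware.

```python
# Minimal illustration of accelerated vs. general-purpose computing:
# the same matrix multiply on the CPU and on an NVIDIA GPU.
# Assumes PyTorch and a CUDA-capable GPU; results vary by hardware.
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# General-purpose computing: run on the CPU
start = time.perf_counter()
c_cpu = a @ b
cpu_s = time.perf_counter() - start

if torch.cuda.is_available():
    # Accelerated computing: move the data to the GPU and run there
    a_gpu, b_gpu = a.cuda(), b.cuda()
    _ = a_gpu @ b_gpu                 # warm-up: triggers CUDA context/kernel init
    torch.cuda.synchronize()          # wait for the warm-up to finish
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()          # wait for the kernel to finish
    gpu_s = time.perf_counter() - start
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s  speedup: {cpu_s / gpu_s:.1f}x")
else:
    print(f"CPU: {cpu_s:.3f}s (no CUDA GPU found)")
```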

Moore’s Law is dead.

Jensen Huang, GTC 2013

AI Factories: A New Class of Data Centers

"AI factories" are a new class of data centers, specially built for processing, refining, and transforming vast amounts of data into valuable AI models and tokens. Unlike traditional data centers designed for IT workloads, AI factories are constructed to deliver automated, professional skills.

These facilities are not multi-workload or multi-tenant; they run one workload—an AI model—and have just one customer or owner, analogous to a traditional factory. AI factories can be built on-premises, in the cloud, or within the data centers of SaaS and AI platform vendors. In the future, every significant company will operate its own AI factories to securely process its valuable proprietary data and turn it into monetizable tokens, encapsulating its knowledge, intelligence, and creativity.

The factories of the future (source: nvidia.com)

Generative AI: A Whole New Industry

The era of generative AI is here to stay, unlocking new opportunities for AI across many different applications. Generative AI is trained on large amounts of data to find patterns and relationships, learning the representation of almost anything with structure. It can then be prompted to generate text, images, video, code, or even proteins. Over 1,600 generative AI companies are building on NVIDIA.

For the very first time, computers can augment the human ability to generate information and create.

NVIDIA architecture to power the AI Industrial Revolution (source: nvidia.com)

NVIDIA views generative AI as the catalyst for creating an entirely new industry centered around AI generation factories. These specialized data centers transform raw data into valuable digital tokens through AI supercomputers, facilitating advancements in various fields from digital biology to personalized recommendation systems.

Generative AI represents a seismic shift in how software is developed and deployed, introducing new ways of computing that require accelerated platforms.

Growth in Inference Computing

The adoption of generative AI models has led to a significant increase in inference computing, which is now an integral part of NVIDIA's business. This growth is attributed to the widespread use of AI models in applications like ChatGPT, Midjourney, and various video generation and editing tools. NVIDIA's inference technology supports the dynamic needs of these applications, contributing to the company's overall growth.

Training vs Inferencing architecture of NVIDIA (source: nvidia.com)

A few days ago, I wrote about the importance of inference and how it will drive most of the AI compute spend. Jensen confirmed that during the conference call: every time you use an AI service, you are using NVIDIA compute to generate the output.
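To ground what an "inference request" actually is, here is a minimal sketch of a single generation call. The Hugging Face transformers library and the small GPT-2 model are my own stand-ins for illustration, not anything NVIDIA referenced on the call; production services run far larger models, but the shape of the call is the same: load a model onto a GPU, send a prompt, get generated tokens back.

```python
# Minimal sketch of a single generative-AI inference call.
# Uses the open-source Hugging Face transformers library and GPT-2 as a
# small stand-in model, chosen purely for illustration.
import torch
from transformers import pipeline

# device=0 places the model on the first CUDA GPU; device=-1 falls back to CPU.
generator = pipeline(
    "text-generation",
    model="gpt2",
    device=0 if torch.cuda.is_available() else -1,
)

# Every prompt a user sends to an AI service is one inference request like this.
result = generator("Accelerated computing matters because", max_new_tokens=40)
print(result[0]["generated_text"])
```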

The New Generation of NVIDIA Products

NVIDIA is preparing to launch its next-generation products, including the H200, the latest iteration of its Hopper GPU architecture, amid challenges related to supply constraints. Despite these hurdles, the company is committed to improving its supply chain and ensuring the timely delivery of its innovative products. These new offerings are expected to further NVIDIA's leadership in accelerated computing and AI technologies.

NVIDIA's Spectrum-X is an Ethernet-based networking platform designed to accelerate AI and machine learning workloads in data centers. It provides the high-bandwidth, low-latency fabric that high-performance, scalable, and efficient AI computing requires, supporting tasks like neural network training and real-time data processing. This makes it well suited to industries with intensive computation and data-exchange needs, and it aims to speed the development and deployment of AI applications.

Overcoming Supply Chain Constraints

NVIDIA acknowledges the current supply chain challenges but is optimistic about improving conditions. The company is working diligently to increase supply and meet the growing demand for its products. Efforts to ensure fair allocation across customers, countries, and industries are in place, with a focus on avoiding unnecessary allocations and supporting a diverse ecosystem of partners and users.

Ensuring Fair Allocation

Demand for Nvidia's chips is so high that Jensen Huang had to assure analysts the company is allocating them 'fairly'.

In response to high demand and limited supply, NVIDIA prioritizes fair allocation of its products. The company's strategy involves transparent communication with cloud service providers (CSPs) and a commitment to equitable distribution among various stakeholders. This approach aims to support a wide range of applications and industries, from startups to large enterprises, ensuring that advancements in AI and computing are accessible to all.

Expanding into the Software Business

A significant highlight from NVIDIA's discussion with investors is its strategic move into the software business, marking a pivotal expansion beyond its hardware roots. NVIDIA's software venture, particularly with NVIDIA AI Enterprise, underscores the company's ambition to become an indispensable part of the AI and computing ecosystem. This initiative aims to provide a comprehensive, optimized, and managed software stack for accelerated computing, enabling enterprises and cloud service providers to seamlessly adopt and deploy AI technologies. Priced at $4,500 per GPU per year, NVIDIA AI Enterprise represents a scalable operating system for AI, promising to bring the company's innovations to a broader range of applications and industries.
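To see why this matters for the business model, the quick arithmetic below shows how that per-GPU license translates into recurring software revenue. The 10,000-GPU fleet size is a hypothetical number of my own, purely for illustration.

```python
# Illustrative arithmetic only: what the $4,500 per GPU per year license
# implies for recurring software revenue. The fleet size is hypothetical,
# not a figure from the earnings call.
license_per_gpu_per_year = 4_500        # NVIDIA AI Enterprise price, USD
gpus_deployed = 10_000                  # hypothetical enterprise fleet

annual_software_revenue = license_per_gpu_per_year * gpus_deployed
print(f"${annual_software_revenue:,} per year")   # $45,000,000 per year
```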

This move into software not only diversifies NVIDIA's revenue streams but also strengthens its position as a full-stack computing platform, pivotal for the development and deployment of generative AI and other advanced computing solutions.

In conclusion, NVIDIA's vision for the future is marked by significant advancements in data center technology, the rise of generative AI, and the continued growth of inference computing.

The company's next-generation products and strategies for overcoming supply chain constraints demonstrate its commitment to innovation and its role in shaping the future of the industry. As NVIDIA continues to navigate these exciting developments, its efforts to ensure fair allocation of resources highlight the company's dedication to fostering an inclusive and forward-thinking computing landscape.

My prediction is as follows:

  • NVIDIA will continue to grow because it has the best products.

  • Competition will arise, but NVIDIA is years ahead in terms of innovation and supply chain management.

  • We might witness some cycles, and some may claim that AI was merely hype and that its era is over, similar to what happened with crypto. However, in my opinion, we are just at the beginning of the revolution.

Find supporting materials here:

  • Listen to the earnings conference call: link

  • Press Release: link

  • Presentation deck: link

And that’s all for today. Enjoy the weekend, folks,

Armand 🚀

Whenever you're ready, there are 2 FREE ways to learn more about AI with me:

  1. The 15-day Generative AI course: Join my 15-day Generative AI email course, and learn with just 5 minutes a day. You'll receive concise daily lessons focused on practical business applications. It is perfect for quickly learning and applying core AI concepts. 10,000+ Business Professionals are already learning with it.

  2. The AI Bootcamp: For those looking to go deeper, join the full bootcamp with 50+ videos, 15 practice exercises, and a community where we all learn together.
