NVIDIA Earnings Wrap - Don't Need AI Inference to See Long-Term Strength or Near-Term Datacenter Digestion
A note last updated on Nov 19, 2020
NVIDIA reported its fiscal 3Q earnings last night. The highlights:
$4.73 billion revenue, +57% year-over-year
Record revenues in gaming, data center and overall
Data center +8%
Professional visualization +16%
An Excel model of $NVDA earnings over the last several years, the full transcript, and recent tweets on the company can be found here:
The NVIDIA story is undoubtedly one of the more attractive situations for exposure to thematic growth trends - data center, machine learning, artificial intelligence, autonomous driving, gaming, augmented reality, virtual reality, video streaming, supercomputing, and others.
GPUs continue to take share in the AI training and inference market from CPUs as CEO Jen-Hsun Huang noted:
Most of the cloud vendors, in fact I believe all of the cloud vendors, use the same infrastructures largely for their internal cloud and external cloud, or have the ability to or largely do. And there's -- the competition, we find to be really good. And the reason for that is this. It just suggests that acceleration -- makes it very clear that acceleration is the right path forward for training and inference. The vast majority of the world's training models are doubling in size every couple of months, and it's one of the reasons where our demand is so great.

The second is inference. The vast majority of the world's inference is done on CPUs. And nothing is better than the whole world recognizing that the best way forward is to do inference on accelerators. And when that happens, our accelerator is the most versatile. It's the highest performance. We move the fastest. Our rate of innovation is the fastest because we're also the most dedicated to it. We're the most committed to it, and we have the largest team in the world to it. Our stack is the most advanced, giving us the greatest versatility and performance.

And so we see spots of announcement here and there, but they're also our largest customers. And as you know that we're ramping quite nicely at Google, we're ramping quite nicely at Amazon and Microsoft and Alibaba and Oracle and others. And so I think the big takeaway is that -- and the great opportunity for us if you look at the vast amount of workload, AI workload in the world, the vast majority of it today is still on CPUs. And it's very clear now this is going to be an accelerated workload, and we're the best accelerator in the world. And this is going to be a really big growth opportunity for us in the near term. In fact, we believe it's our largest growth opportunity in the near term, and we're in the early innings of it.
There may be some near-term weakness in data center, however, as hyperscalers digest recent build-outs.
Let me make sure we clarify for those also on the call. Yes, we expect our Data Center revenue in total to be down slightly quarter-over-quarter. The computing products, NVIDIA computing products, is expected to grow in the mid-single digits quarter-over-quarter as we continue the NVIDIA AI adoption and particularly as A100 continues to ramp. Our networking, our Mellanox networking, is expected to decline meaningful quarter-over-quarter as sales to that China OEM will not recur in Q4, though we still expect the results to be growth of 30% or more year-over-year. The timing of some of this business therefore shifted from Q4 to Q3. But overall, H2 is quite strong.

So in referring to overall digestion, the hyperscale business remains extremely strong. We expect hyperscales to grow quarter-over-quarter in computing products as A100 continues to ramp. The A100 continues to gain adoption not only across those hyperscale customers, but again we're also receiving great momentum in inferencing with the A100 and the T4.
The Mellanox acquisition continues to pay off, with growth in the last two quarters of roughly 75% versus the high 20s prior to the acquisition.
it's safe to say that high-speed networking is going to be one of the most important things in cloud data centers as we go forward. And the vast majority of the world's data center is still built for the traditional hyper-converged architecture, which is all moving over to micro services-based disaggregate -- software-defined disaggregated architectures. And that journey is still in its early days.

And so I fully expect future cloud data centers, all future data centers are going to be connected with high-speed networking inside. They call it east-west traffic. And all of the traffic will be secured. And so imagine building firewalls into every single server. And imagine every single transaction, every single transmission inside the data center to be high speed and fully encrypted. And so pretty amazing amount of computation is going to have to be installed into future data centers. But that's an accepted requirement now. And I think our networking business, Mellanox, is in the early innings of growth.
There are some cracks in the armor, though, that should be considered. Stripping away working capital changes, funds from operations less capex was 23.5% of revenue, largely as a result of much higher capex (+118% sequentially, +359% y/y).
NVDA's FFO margin (cash flow before working-capital changes divided by revenue) remains very strong, however. For every $1 of revenue, NVDA has recently been converting roughly $0.40 into cash. If it can hold these margins and grow revenue, the story remains compelling long-term despite lofty valuations (75x FCF, 55x EBITDA).
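The margin math above can be sketched in a few lines. This is an illustrative calculation only, assuming the ~40% cash-conversion rate cited in the note applied to the reported $4.73 billion of quarterly revenue; the cash-flow figure is derived for illustration, not taken from NVIDIA's filings.

```python
def ffo_margin(cash_flow_before_wc: float, revenue: float) -> float:
    """FFO margin = cash flow before working-capital changes / revenue."""
    return cash_flow_before_wc / revenue

# Hypothetical quarter: $4.73B revenue converting ~$0.40 of each $1 into cash.
revenue = 4.73e9
cash_flow = 0.40 * revenue  # derived, not a reported figure

print(f"FFO margin: {ffo_margin(cash_flow, revenue):.1%}")  # → FFO margin: 40.0%
```

The same function applied to the post-capex figure in the note (23.5% of revenue) shows how much of the gap between the two margins capex currently absorbs.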