NVIDIA Corporation (NVDA) Credit Suisse 26th Annual Technology Conference (Transcript)

NVIDIA Corporation (NASDAQ:NVDA) Credit Suisse 26th Annual Technology Conference Call November 30, 2022 10:55 AM ET

Company Participants

Colette Kress – Chief Financial Officer

Conference Call Participants

Christopher Caso – Credit Suisse

Christopher Caso

So thanks, everyone. I'm Chris Caso, the Credit Suisse semiconductor analyst; welcome back to the conference. Our next speaker is NVIDIA, which you all know. Here from NVIDIA is Colette Kress, CFO and Arizona native, coming back home to Arizona.

So Colette, welcome.

Colette Kress

Thank you. Thank you, Chris. I just want to make an opening statement.

First, as a reminder, this presentation contains forward-looking statements, and investors are advised to read our reports filed with the SEC for information related to risks and uncertainties facing our business. With that, I thought I'd open with some key themes and takeaways, as we just finished our last quarter and gave guidance for the fourth quarter that we're in. We are working through the inventory correction that is needed for our gaming business. We are making progress; Q3 definitely showed progress in reducing the inventory that we have out in the channel. That quarter was also when we began the very beginnings of our Ada gaming launch, and that is going quite well. We do believe that as we approach the end of Q4, we will have reached roughly normalized levels of inventory for our gaming business.

Secondly, theme-wise, we've got a great product lineup with new architecture launches, not only in our gaming business but in our data center business, plus our continued work in automotive with our Orin architecture. We have Ada out and running. We have Hopper out for our data center business; it will begin ramping next year, and we'll start production in this Q4, consistent with the guidance that we have. So I just wanted to open with some of the key things that we're focused on right now.

Christopher Caso

Great. Thank you. And maybe starting with some of that. As I've spoken to investors in the past few weeks, there does seem to be an appetite for the semiconductor space after the pullback that we've had. And I think a lot of folks are looking at the extent to which things are derisked and making sure no other shoes have dropped. On the call, you said that you believe end demand in gaming is on the order of $5 billion over a 2-quarter period, so just about $2.5 billion a quarter.

The consensus estimates have your gaming business running at about $1.5 billion a quarter now, so that's a pretty sizable gap. Does that give you some confidence that, as the inventory burns through, you get back to that steady-state demand, such that even if demand weakens a bit (and who knows with China), there are no other shoes to drop in gaming?

Colette Kress

Yes, that's correct. We had highlighted, at the very onset of when we saw the need for the inventory correction, what we saw in terms of solid sell-through in gaming, to help folks understand. So we came up with a view of our normalized gaming business of about $5 billion stretched over 2 quarters. Yes, there is always some seasonality, so sometimes it's going to be $2.5 billion a quarter, give or take, depending on the quarter leading in. We still see gaming as solid, and we're continuing to watch the sell-through each and every day, particularly as we are in the Western holidays right now. We have been under-shipping gaming at this time so that we can correct the inventory that is out in the channel.

And as we discussed, we plan that, hopefully, as we approach the end of Q4, we will be in a solid position on that. That means as we move forward, we will start to build back up. We'll get to some point of equilibrium between sell-through and sell-in, and then we will likely get to where we'll be selling in to build the channel for our Ada launch and all of the additional Ada products that we will see coming to market.

Christopher Caso

And following from that, as we look into the April quarter, you haven't provided guidance and I assume you're not going to provide guidance right now. But what are the puts and takes that we should think about? You've got things like seasonality in the gaming business, you have the normalization of inventory, and you have the new H100, which starts going into cloud instances in the April quarter. So maybe you could speak about some of those things we should consider when thinking about the April quarter.

Colette Kress

Correct. We have given our guidance for Q4, and then we're going to start off a new year. Our new year is going to be a year focused on our new products coming to market. Ada is just in the early days of shipping right now. We have shipped one of our very high-end cards, the 4090, and our 4080 is just now starting to sell as well. So we have plenty of opportunity with the innovation that we've brought to market for gaming. But that's also true for our data center business. Our data center business is now coming forth with our H100 architecture. That's a great opportunity to really take advantage of a lot of the innovation that we put in to focus on large language models; Transformers are a very big part of that. And the overall performance improvement, the efficiencies, the TCO that you get from H100 are there. We will begin production here in Q4, and as we move into the year, we will start the ramp of H100.

As we discussed on our earnings call, we will see cloud instances already set up for the H100 as we start the new year as well. So we know it's a very important product. And we feel that we're in a great position with the new products coming to market, not just H100 in data center but the networking, which often comes closely coupled with what we're selling in H100. What's out there that we can't control, though, is the macro conditions, the macro conditions around the world. They're here in the U.S., they're definitely here in Europe. And in China, we've talked about the economy really being tough right now.

Christopher Caso

And China is important for you as you go into April given the New Year holiday that occurs there?

Colette Kress

There is a holiday season in the Asia Pac and China area as well. And we've still been seeing solid sell-through in China. However, with the lockdowns that they've been experiencing with COVID, it makes the distribution and the sale process just a little harder. So we'll continue to watch that closely.

Christopher Caso

Maybe you can speak about the Ada launch. I think there's been a perception that gaming had been one of the pandemic beneficiaries, since we couldn't go anywhere, and that there would be some impact now that you have other things to do. To what extent have you seen that? What's your perception of the success of the Ada launch as compared to the 30 Series and the 20 Series launches? How do you benchmark that?

Colette Kress

When we look at gaming, we've always discussed that there may have been a little bit of a pull-in as folks looked for some type of sport during the pandemic. We did see some great new gamers joining that market and taking advantage of the leadership that we have in gaming. Not only is Ada a leader in the industry, so is Ampere; the number 1 and number 2 top architectures are NVIDIA. We continue to watch what we see on Steam. Steam still has a record level of gamers online gaming, and the top 15 cards on Steam are NVIDIA, all 15 out of 15. So this is an area where gamers will be gamers for life, and we watch this industry both in the great times and also during some of the hard times. It is a sport that is economical, because you can choose many different form factors at many different price points for the gaming that you want to do.

With our Ada launch, when the 4090 came out, yes, it was only the very first card and the high-end card, and it sold out in 2 weeks. So we were very pleased to see that; it was a nod for us that gaming, no matter what time of year, no matter what the situation, is still a very important entertainment industry.

Christopher Caso

Okay. We’ll pivot over to data center for a bit and the H100. So maybe you can speak to what that means in terms of performance for your customers, what that means in terms of content for NVIDIA.

Colette Kress

Yes. The H100, in terms of its performance improvement, depending on the type of workloads, can be more than 6x what we see with A100. We went through the MLPerf benchmarking for training and the MLPerf benchmarking for inference, and in both of those cases, hands down, everything that we do here at NVIDIA shows through in that performance. What we see in the market, and what led us in the design of H100, is the continued advancement of AI; each and every season, each and every year, there is something new. We're still in the very early days of AI. But large language models, the size of large language models and the influence of natural language processing within that are very front and center.

That's not to mention that recommender engines still have a ways to go and are very popular with the consumer Internet companies as well, to make sure that their marketing is appropriately targeted; so H100 really hones in on that. And when you think about our development of H100, it was done in coordination with the customers, in coordination with the enterprises, in coordination with the hyperscalers, so that they are seeing what they would need in there. As we move forward, it's not about H100 being a chip; it is about being a system, it is about thinking of the process of AI from the very beginning, when the data enters the data center, to the time that it leaves. And so you will see a lot of work there, not only with the interconnects, the links, the networking and the memory, to all assist in that full process.

Christopher Caso

Yes. I mean, it's a good segue into the next question, which is the sustainable advantage you have in AI. I don't think there's much investor question about the growth rate of AI, and at this point, I don't think there's much question about NVIDIA's leadership in that space; that's why the room is so packed. Going forward, how do you sustain that? What's the factor? It has to be more than just the silicon itself. The concerns that have been voiced in the past have been things like start-up companies taking their own approach, or the hyperscalers doing things themselves. So how do you fend off those challenges and retain that advantage?

Colette Kress

So when you think about our products coming to market, we may first discuss the performance and availability; people ask about that, they talk about what they will see. But most important in there is the software that we've built along the way, and this is not software that we developed just in this last quarter or this last year. You're talking about 10 to 15 years' worth of developing software for this platform and really transforming how the world thinks about computing. The CPU computing architecture has been here for a long time. But to use a GPU, you are really rethinking that application, you're rethinking the use of that hardware in the GPU.

So our full-stack view of software is essential. We're able to package it up and create a full system package, and NVIDIA AI Enterprise is really out there to help enterprises start their very first work in AI or continue the workloads that they're doing. We have SDKs for some of the leading industries and their top applications so that they can begin that work. It's very hard to think of a start-up or just a new chip getting any traction in this market if there is no software; somebody has to do all that work.

We have more than 3.5 million developers that develop solutions on our hardware and on our software, using our full systems. Each and every time that we bring something to market, it's really about the developers, what they are looking for and how they are going to use us. So I think we've been extremely successful at continuing that model; we focus on the development platform and the software equally as much as we focus on the hardware.

Christopher Caso

Could you give us some help in how we should think about the growth rate of data center over time? It's been something that's been incredibly difficult to model. For the most part, I think we've all underestimated the growth of it, which is a good problem to have. But going forward, you've got disparate growth rates within the cloud part of your business, which is about half, and the enterprise business, the other half. And content ASPs are rising as well.

So I think the way a lot of folks have done it is to take some multiple on cloud growth, but that's not really a very accurate way of looking at it. And I don't know if you have another way, how NVIDIA looks at it internally.

Colette Kress

So as we discussed in our earnings, we have talked about our business in data center, and we've broken it out between our hyperscalers and the vertical industries that we're seeing. In each quarter, it's probably about 50-50. We're going to work on providing a little bit more disclosure going forward. Our hyperscalers are important. They're large companies, but many of them are also cloud companies, serving cloud instances for the enterprises. So if we can take this time to look at what we're seeing also with the cloud, that gives you a better understanding of really the vast range of enterprises that are using AI and accelerated computing as we go forward.

So stay tuned as we move forward on that. What we see with the hyperscalers is just that: setting up cloud instances based on need for many different types of industries and enterprises. They do it also for researchers, they do it also for higher education. And what we are seeing over this period of time is that cloud continues to be a very important part, both historically and likely going forward. When you think about the challenges of the economy, setting up a cloud instance to get started is an easier path than working through a significant amount of capital to build out your overall data center. It's also becoming more difficult for enterprises to have IT organizations of real size that can master some of these very complex configurations that we're seeing in data centers today.

So the cloud is going to be very important. Think about some of the announcements that we made with Q3 about our work with hyperscalers such as Azure and Oracle, selling with them to the enterprises. This is a real breakthrough: it shows the importance of our cloud providers, but it also recognizes the importance of our relationships with the enterprises as well as our software stack. So now they have the ability to market the software stack and create instances that an enterprise could get, whether they built it on their own or put it in the cloud. So we're really pleased with the partnership and working with our cloud providers here.

Christopher Caso

Does that mean that over time we should see the cloud part of the business grow to be a bigger part of your business, as maybe we see some migration from on-premise to cloud? Is that just a more efficient way of dealing with workloads?

Colette Kress

It's tough to say what the different growth rates will be, because you'll still have many that will desire to be on-prem. But stepping back from it, we're indifferent, right? We're indifferent as to whether a consumer or an enterprise wants to build on-prem, wants to use a cloud instance, or whatever the hyperscaler wants to do. That's what we're here for. We have many different ways in which you can use AI and accelerated computing, and we have now set that up throughout the world. So we're indifferent. But yes, cloud will be a very important industry going forward.

Christopher Caso

I also want to speak about one of the other product cycles you're expecting this year: Grace, the CPU product that gets added there. Maybe you could speak to the timing; I think in our model, we're expecting some moderate impact in the second half of this year. Speak to what that means for your customer in terms of performance and what it means for NVIDIA in terms of content.

Colette Kress

Yes. So Grace, Grace is our first CPU, our first Arm-based Grace CPU coming to market. We plan to start testing and bring it to sampling here in the first half; in the second half, we'll talk about it being in market. What Grace is: Grace works together with our Hopper architecture. It is geared toward very large models, very high-end AI. It's not a general-purpose CPU; it has specifics focused on accelerated computing and AI together.

So we are excited to bring it to market. We have worked to really understand the importance that CPU will have on AI workloads, and we'll see how we do in the second half of the year.

Christopher Caso

How do you capture value on that? Because obviously, if a customer buys a board with Grace on it, that removes the need for an accompanying Intel or AMD server, and the cost of that platform is pretty close to what an A100 would be today. Is NVIDIA in a position to capture all that value, because you're essentially giving the customer better performance and you're actually removing some additional costs for them?

Colette Kress

When we are developing all of our systems, it is about the total TCO once customers have moved to accelerated computing. There isn't any one of our data center architectures that is not improving TCO. Enabling a DGX that incorporates Grace and Hopper together is just a better TCO for that customer, because you have a very specific CPU for the workload that we're trying to do. But it's not uncommon for us to also use our peers' CPUs as well. Those have been within our DGX, and that is also beneficial to us, just as long as we're moving to accelerated computing.

Christopher Caso

Right. But it's safe to say there's some content that you stand to gain on that as you're adding higher performance. NVIDIA is generally a company that prices to value. So as you drive performance, which you do in all your markets, your ASPs have been going up.

Colette Kress

Yes. We price for the innovation and the savings of that TCO. That's really the underlying approach, whether that be gaming or whether that be data center; that's what we're trying to do.

Christopher Caso

Fair enough. On some of the macro concerns, several of our colleagues at Credit Suisse have concerns about cloud CapEx in 2023, and obviously there's some pressure on enterprises. Does that moderate your view of the H100 or Grace product cycle as you go into 2023? Is that something where we should take a little more moderate expectation and expect more in 2024? How do we think about it?

Colette Kress

No. We've studied and we're continuing to watch the macroeconomic conditions. One of the things that you see during these times is that enterprises will really study and work through what they're going to purchase. Sometimes that means a lengthening of the time to close deals overall. That may happen, although there are certain industries where AI and accelerated computing are so important to the work that they do, important to the monetization of their platform, important in terms of new growth for them.

So there is a focus on accelerated computing and AI, whether it be for sustainability and thinking through the use of energy; it's a more efficient platform. And the leading edge of AI will be important for so many businesses. So we're in a good position for that, both with the new architecture and with being a leader in that industry. The macroeconomic conditions are there, though, so we are watching them carefully, but we know that the cloud will be essential to keep going during this time.

Christopher Caso

Okay. I'll pivot to something else which NVIDIA has spoken about pretty consistently over the last year. I don't think it's as high of an investor focus, but it's software. And I think software is interesting because when you add software economics to a semiconductor company's model, it could be pretty important. How important do you expect software to be to the financials over the next 18 to 24 months?

Colette Kress

Yes. So we have started selling our software separately. Our data center products and gaming products, all of them, have had software since the very beginning. But what we've done is package up key software in probably 3 different areas. One, NVIDIA AI Enterprise, which is a set of system software that helps every enterprise get started and essentially treat this like many of their other enterprise software capabilities: being able to monitor jobs, being able to manage the data center as a whole and essentially get started on those workloads. That's very difficult for enterprises to start on their own; they wouldn't necessarily have the teams on board, the engineers on board, to do that. Essentially, this is also an area where enterprises want that software because, when applying it to some of their mission-critical applications and work, they want to know that we are there to support, update and keep all of that software current. So there is a double win in terms of that. So that's our first piece.

The second one is our Omniverse. Our Omniverse is an important area, thinking about the 3D virtual world and the simulations of much of the build design, the digital twin environments that are out there. There are many different use cases. So this is an area where we can sell in a pod with our enterprise customers or even on an individual basis. Our third piece is focusing on automotive. As we've been working on automotive in terms of AV, we have agreements where we will be selling software along with the hardware and maintaining that software over the life of the car. So this can be a meaningful piece in these 3 areas for us. We're just getting started in working and selling it. You'll even see that certain forms of H100 will come with that software coupled together with the H100. We're excited to see this. This is a driver not only of top line, but it can also be an improvement to our gross margin for the long term. So we'll see this grow.

Christopher Caso

What’s the chance that we get to see software kind of broken out with a bit more granularity in the near future? Is that something you’ve contemplated?

Colette Kress

We will be giving you information on some of our key customers in software and key milestones that we reach in software. We think that would be the most beneficial way to help you walk through this journey with us.

Christopher Caso

Okay. Maybe I'll check to see, since it's a packed room, if there's a question that somebody would like to ask. Anyway, I'll continue as you think about it. Automotive as well. In automotive, there's been a transition from the infotainment part of the business, which has trended down, and we've been waiting for ADAS and full self-driving to have an impact. It sounds like we're kind of on the edge of that beginning to occur. So maybe you could speak to that a bit.

Colette Kress

Correct. What you're seeing today is that our automotive business has some very solid growth right now. Our Orin architecture came to market at a very important time, when there are 2 important things happening in the automotive industry: EVs, or electric vehicles, as well as AV. Now, when you are dealing with your EV customers, they're really thinking through a complete redesign of the car from the bottom up; they're building a car from scratch. An important piece of that is the computing platform, and Orin is exactly the right product for that, from soup to nuts. So we have them working not only on consolidating all that different software into one computing platform but also on adding AV features to the work that we are seeing. So NEVs across the world, particularly in the Asia Pac and China area, have really launched on our Orin architecture and they're using that.

We also now have agreements with very large companies, Daimler and JLR among them, to provide them not only the hardware but a full software stack. Those will be coming in the next couple of years. They will start in terms of preproduction, they'll start in terms of sampling in a little bit more than a year, and then we'll see volume after that. So automotive has those 2 things occurring at the same time.

Christopher Caso

So I guess the way to think about it is 2023, going into '24, as more the NEVs, and then '24, '25 as the larger OEMs; is that the right way to think about it?

Colette Kress

We believe that’s the general direction that it will go.

Christopher Caso

And will the NEV piece be material? You said the auto business has seen strong growth. But the high-class problem you've had is that you've had strong growth in a lot of different areas, so auto has kind of stayed a smaller percentage of revenue. Once we get to those larger OEMs like Daimler in '24, '25, should we expect auto to grow as a more significant portion of NVIDIA revenue at that point?

Colette Kress

That's correct. When we look at our fiscal year '25, we do believe there will be a significant amount in there. Keep in mind, with our automotive customers, not only is what they have within the car, what they're purchasing from NVIDIA, important, but they're also very important data center customers. In order to build what they want for the AV of the future, they have our computing back in their data centers; it's a very important part of that. They've also been working, of course, with our workstations since the beginning of time. So it's interesting to see: yes, we have what is within the car itself, but keep in mind, we also have a great business with them through the data center.

Christopher Caso

Okay. We're almost out of time, so I'll end on a question on margins. In NVIDIA's case, being fabless with so many products, your margins have been fairly stable. Is there anything about the growth rates of different businesses, or your spending itself, that we should consider with regard to gross margins or operating margins?

Colette Kress

So looking at our gross margins, we look not only at the manufacturing cost but remember, we have so much engineering that's working on the software. Thus, the value that we deliver in terms of innovation has allowed us to have very solid gross margins. In the last year or couple of years, you've certainly seen the cost of manufacturing, the cost of components and contract manufacturing continue to increase. We've absorbed a lot of that and still had very, very solid gross margins. We take the opportunity, each time we look at a new architecture, to price on the value that we're providing, but we also have to take into account the cost of some of this very, very high-end manufacturing that we do. So those are the important pieces.

Now, when you look at our overall operating margins, it's another case of very strong operating margins. During this time, it is always an opportunity for us to think through how we spend our money, but we're quite an efficient company. We're only about 25,000 people for a company that's operating well into the double-digit billions. We took this time to slow down our hiring and focus on the engineers, focus on the teams that we had. And I think the teams rallied around that, but we also continue to look to make sure we are spending our money wisely. So that will be our focus: not only gross margin for the long term but our operating margins as well.

Christopher Caso

Well, that’s great. It looks like we’re out of time. But Colette, thanks for your time.

Colette Kress

Thank you, Chris.

Christopher Caso

Thanks, everyone.

Question-and-Answer Session

End of Q&A
