NVIDIA 2Q17 Earnings Call Notes

Jen-Hsun Huang – NVIDIA Corp.

GPU acceleration of servers delivers extraordinary value proposition

“Data center is a very large market, as you know, and the reason for that is because the vast majority of the world’s future computing will be largely done in data centers. And there’s a very well accepted notion now that GPU acceleration of servers delivers an extraordinary value proposition. If you have a data-intensive application, and the vast majority of the future applications in data centers will be data intensive, a GPU could reduce the number of servers you require or increase the amount of throughput pretty substantially. Just adding one GPU to a server could yield several hundred thousand dollars of reduction in the number of servers. And so the value proposition and the cost savings of using GPUs are quite extraordinary.”

Cryptocurrency is here to stay

“Cryptocurrency and blockchain are here to stay. The market need for it is going to grow, and over time it will become quite large. It is very clear that new currencies will come to market, and it’s very clear that the GPU is just fantastic at cryptography. And as these new algorithms are being developed, the GPU is really quite ideal for it. And so this is a market that is not likely to go away anytime soon, and the only thing that we can probably expect is that there will be more currencies to come.”

Costs close to $1,000 to manufacture a top-of-the-line GPU

“And so the price of Volta is driven by the fact that, of course, the manufacturing cost is quite extraordinary. These are expensive things to go and design. The manufacturing cost itself, you guys can estimate it, is probably in the several hundred dollars to close to $1,000. However, the software intensity of developing Volta, the architectural intensity of developing Volta, all of the software intensity associated with all the algorithms and optimizing all the algorithms of Volta is really where the value-add ultimately ends up. And so I guess your question relates to pricing. We expect pricing to be quite favorable for Volta.”

Going to see Level 4 self-driving cars in 2021

“And then the fully autonomous, driverless, branded cars will start hitting the road around 2020 and 2021. So the way to think about it is this year and next is really about development. Starting next year and the following year is robot taxis. And then from 2021 forward you’re going to see a lot of Level 4s.”

Volta is a giant leap

“Volta was a giant leap. It’s got 120 teraflops. Another way to think about that is eight of them in one node is essentially one petaflops, which puts it among the top 20 fastest supercomputers on the planet. And the entire world’s top 500 supercomputers are only 700 petaflops. And with eight Voltas in one box, we’re doing artificial intelligence that represents one of them. So Volta is just a gigantic leap for deep learning and it’s such a gigantic leap for processing that – and we announced it at GTC, if you recall, which is practically right at the beginning of the quarter.”
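
For reference, the arithmetic behind that claim, using the figures as stated on the call (the 120 teraflops is Volta’s deep learning, i.e. Tensor Core, throughput):

```latex
% Eight Voltas in one node, at the quoted 120 TFLOPS each:
8 \times 120\ \text{TFLOPS} = 960\ \text{TFLOPS} \approx 1\ \text{PFLOPS}
% Set against the quoted ~700 PFLOPS aggregate of the world's top 500
% supercomputers, one such node is on the order of 1/700 of that combined total.
```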

Neural net hardware and software are improving at an exponential rate

“A neural net in terms of complexity is approximately – not quite, but approximately – doubling every year. And this is one of the exciting things about artificial intelligence. At no time in my history of looking at computers over the last 35 years have we ever seen a double exponential, where the GPU computing model – our GPUs – is essentially increasing in performance by approximately three times each year. In order to be 100 times faster in just four years, we have to increase overall system performance by over a factor of three every year.

And yet on the other hand, on top of it, the neural network architecture and the algorithms that are being developed are improving in accuracy by about twice each year. And so object recognition accuracy is improving by twice each year, or the error rate is decreasing by half each year. And speech recognition is improving by a factor of two each year. And so you’ve got these two exponentials that are happening, and it’s pretty exciting. That’s one of the reasons why AI is moving so fast.”
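
The “over a factor of three” figure follows from simple compounding; as a quick check of the numbers quoted above:

```latex
% Annual growth factor r needed for a 100x gain over four years:
r^4 = 100 \;\Rightarrow\; r = 100^{1/4} \approx 3.16
% So slightly more than 3x per year compounds to 100x in four years,
% while network complexity itself is roughly doubling (2x) each year,
% which gives the two exponentials Huang refers to.
```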

The whole car is going to become an AI

“The second major component is our self-driving car platforms, and a lot of it still is infotainment systems. Our infotainment system is going to evolve into an AI cockpit product line. We initially started with autonomous driving. But you probably heard me say at GTC that our future infotainment systems will basically turn your cockpit or turn your car into an AI. So your whole car will become an AI. It will talk to you. It will know where you are. It knows who’s in the cabin. And if there are potential things to be concerned about around the car, it might even just tell you in natural language. And so the entire car will become an AI.”

The next revolution of AI will be at the edge and the most visible impact will be in the autonomous vehicle

“The next revolution of AI will be at the edge, and the most visible impactful evidence will be the autonomous vehicle. Our strategy is to build a ground-up deep learning platform for self-driving cars, and that has put us in pole position to lead the charge.”

NVIDIA 1Q17 Earnings Call Notes

Jen-Hsun Huang – NVIDIA Corp.

Three pillars of the cloud

“There are three major pillars of computing up in the cloud or in large data centers, hyperscale. One pillar is just internal use of computing systems, for developing, for training, for advancing artificial intelligence. That’s a high-performance computing problem.”

“The second pillar is inferencing. And inferencing, as it turns out, is far, far less complicated than training – a billion, a trillion times less complicated. And so once the network is trained, it can be deployed. And there are thousands of networks that are going to be running inside these hyperscale data centers, thousands of different networks, not one, thousands of different types. And they’re detecting all kinds of different things. They’re inferring all kinds of different things, classifying, predicting, all kinds of different things, whether it’s photos or voice or videos or searches or whatnot.”

“And then the last pillar is cloud service providers, and that’s basically the outward public cloud provisioning a computing approach. It’s not about provisioning inferencing. It’s not about provisioning GPUs. It’s really provisioning a computing platform.”
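
To make the training-versus-inference asymmetry behind the first two pillars concrete, here is a minimal NumPy sketch (illustrative only, not NVIDIA code; the network shape, data and hyperparameters are invented for the example). Training loops over thousands of forward and backward passes to fit the weights; inference afterwards is a single forward pass over the frozen weights, which is why a trained network is so much cheaper to deploy.

```python
import numpy as np

# Toy two-layer network. Dimensions and data are made up for the sketch.
rng = np.random.default_rng(0)
X = rng.standard_normal((1024, 32))                               # fake inputs
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float).reshape(-1, 1)    # fake labels

W1 = rng.standard_normal((32, 64)) * 0.1    # first-layer weights
W2 = rng.standard_normal((64, 1)) * 0.1     # second-layer weights
lr = 0.1

def forward(X, W1, W2):
    h = np.maximum(X @ W1, 0.0)              # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2)))      # sigmoid output
    return h, p

# --- Training: thousands of forward + backward passes over the data ---
for step in range(2000):
    h, p = forward(X, W1, W2)
    grad_logits = (p - y) / len(X)           # gradient of mean cross-entropy loss
    grad_W2 = h.T @ grad_logits
    grad_h = grad_logits @ W2.T * (h > 0)    # backprop through the ReLU
    grad_W1 = X.T @ grad_h
    W1 -= lr * grad_W1                       # update the weights
    W2 -= lr * grad_W2

# --- Inference: a single forward pass with the trained, frozen weights ---
_, p = forward(X, W1, W2)
print("training accuracy:", ((p > 0.5) == y).mean())
```

In production, the training side is the iterative, compute-hungry part Huang calls a high-performance computing problem, while the inference side is what gets replicated across the thousands of deployed networks he describes in the second pillar.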

AI is going to eat software

“AI is going to infuse all of software. AI is going to eat software. Whereas Marc [Andreessen] said that software is going to eat the world, AI is going to eat software, and it’s going to be in every aspect of software. Every single software developer has to learn deep learning. Every single software developer has to apply machine learning. Every software developer will have to learn AI. Every single company will use AI. AI is the automation of automation, and it will likely be the transmission. We’re going to, for the first time, see the transmission of automation the way we’re seeing the transmission and wireless broadcast of information for the very first time. I’m going to be able to send you automation, send you a little automation by email.”

The impact of AI is huge and we’re just getting started

“And so the ability for AI to transform industry is well understood now. It’s really about automation of everything, and the implication of it is quite large. We’ve now been using deep learning – we’ve been in the area of deep learning – for about six years. And the rest of the world has been focused on deep learning for somewhere between one and two years, and some of them are just learning about it.

And almost no companies today use AI in a large way. So on the one hand, we know now that the technology is of extreme value, and we’re getting a better understanding of how to apply it. On the other hand, no industry uses it at the moment. The automotive industry is in the process of being revolutionized because of it. The manufacturing industry will be. Everything in transport will be. Retail, e-tail, everything will be. And so I think the impact is going to be large, and we’re just getting started. We’re just getting started.”

The thing about tech is that it moves exponentially

“Now that’s kind of a first inning thing. The only trouble with a baseball analogy is that in the world of tech, things don’t – every inning is not the same. In the beginning the first inning feels like – it feels pretty casual and people are enjoying peanuts. The second inning for some reason is shorter and the third inning is shorter than that and the fourth inning is shorter than that. And the reason for that is because of exponential growth. Speed is accelerating.

And so from the bystanders who are on the outside looking in, by the time the third inning comes along, it’s going to feel like people are traveling at the speed of light next to you. If you happen to be on one of the photons, you’re going to be okay. But if you’re not on the deep learning train in a couple of two, three innings, it’s gone. And so that’s the challenge of that analogy because things aren’t moving in linear time. Things are moving exponentially.”

Computing advances are no longer about transistors alone

“The easy way to think about that is that we can no longer rely – if we want to advance computing performance, we can no longer rely on transistor advances alone. That’s one of the reasons why NVIDIA has never been obsessed about having the latest transistors. We want the best transistors. There’s no question about it, but we don’t need it to advance. And the reason for that is because we advance computing on such a multitude of levels, all the way from architecture, this architecture we call GPU accelerated computing, to the software stacks on top, to the algorithms on top, to the applications that we work with. We tune it from top to bottom, all the way from bottom to top. And so as a result, transistors are just one of the 10 things that we use. And like I said, it’s really, really important to us. And I want the best, and TSMC provides us the absolute best that we can get, and we push along with them as hard as we can. But in the final analysis, it’s one of the tools in the box.”

Colette M. Kress – NVIDIA Corp.

Data center revenue nearly tripled from a year ago

“Next, data center, record revenue of $409 million was nearly triple that of a year ago. The 38% rise from Q4 marked its seventh consecutive quarter of sequential improvement. Driving growth was demand from cloud service providers and enterprises building training clusters for web services, plus strong gains in high-performance computing, GRID graphics virtualization, and our DGX-1 AI supercomputer.”

NVIDIA GPUs are at the center of AI

“AI has quickly emerged as the single most powerful force in technology, and at the center of AI are NVIDIA GPUs. All of the world’s major Internet and cloud service providers now use NVIDIA Tesla-based GPU accelerators – AWS, Facebook, Google, IBM, and Microsoft, as well as Alibaba, Baidu, and Tencent. We also announced that Microsoft is bringing NVIDIA Tesla P100 and P40 GPUs to its Azure cloud.”

NVIDIA 4Q16 Earnings Call Notes

Jen-Hsun Huang

Enterprise is waking up to the power of AI

“And so I think the hyperscalers are going to continue to adopt GPUs both for internal consumption and cloud hosting for some time to come. And we’re just in the beginning of that cycle, and that’s one of the reasons why we have quite a fair amount of enthusiasm around the growth here. You mentioned enterprise, and enterprise has all woken up to the power of AI, and everybody understands that they have a treasure trove of data that they would like to find a way to discover insight from. In the case of real applications that we’re engaging now, you could just imagine that in the transportation industry, with car companies creating self-driving cars, one car company after another needs to take all of their raw data and start to train their neural networks for their future self-driving cars. And so they use our DGX or Tesla GPUs to train the networks, which is then used to run their cars running on DRIVE PX.”

We have to make VR headsets easier to use with fewer cables

“The early VR is really targeted at early adopters. And I think the focus of ensuring an excellent experience that surprises people, that delights people, by Oculus and by Valve and by Epic and by Vive, by ourselves, by the industry, has really been a good focus. And I think that we’ve delivered on the promise of a great experience. The thing that we have to do now is that we have to make the headsets easier to use, with fewer cables. We have to make them lighter, we have to make them cheaper. And so those are all things that the industry is working on, and as the applications continue to come online, you’re going to see that they’re going to meet themselves and find success. I think the experience is very, very clear that VR is exciting.”

Deep learning is a breakthrough in the category of machine learning

“Deep learning is a breakthrough technique in the category of machine learning, and machine learning is an essential tool to enable AI, to achieve AI. If a computer can’t learn, and if it can’t learn continuously and adapt with the environment, there’s no way to ever achieve artificial intelligence. Learning, as you know, is a foundational part of intelligence, and deep learning is a breakthrough technique where the software can write software by itself by learning from a large quantity of data. Prior to deep learning, there were other techniques like expert systems and rule-based systems and hand-engineered features, where engineers would write algorithms to figure out how to detect a cat, and then they would figure out how to write another algorithm to detect a car. You could imagine how difficult that is and how imperfect that is. It basically kind of works, but it doesn’t work well enough to be useful. And then deep learning came along. The reason why deep learning took a long time to come along is because its singular handicap is that it requires an enormous amount of data to train the network, and it requires an enormous amount of computation. And that’s why a lot of people credit the work that we’ve done with our programmable GPUs and our GPU computing platform and the early collaboration with deep learning.”

We haven’t found boundaries of problems that deep learning can solve

“Now, the reason why deep learning has just swept the world: it started with convolutional neural networks, but there are now reinforcement networks and time sequence networks and all kinds of interesting adversarial networks. And the list of types of networks – I mean, there are 100 networks being created a week, and papers are coming out of everywhere. The reason why is because deep learning has proven to be quite robust. It is incredibly useful, and this tool has at the moment found no boundaries of problems that it’s figured out how to solve.”

You could achieve Level 5 autonomy today with more chips

“DRIVE PX today is a one-chip solution for Level 3. And with two chips, two processors, you can achieve Level 4. And with many processors, you could achieve Level 5 today. And some people are using many processors to develop their Level 5, and some people are using a couple of processors to develop their Level 4. That’s all based on the Pascal generation. Our next generation processor is called Xavier. We announced that recently. Xavier basically takes four processors and shrinks them into one. And so we’ll be able to achieve Level 4 with one processor. That’s the easiest way to think about it. So we’ll achieve Level 3 with one processor today. Next year, we’ll achieve Level 4 with one processor, and with several processors, you could achieve Level 5.”

Colette Kress

Level 4 autonomy in Audi by 2020

“Jen-Hsun was joined on the CES stage by Audi of America’s President, Scott Keogh. They announced the extension of our decade-long partnership to deliver cars with Level 4 autonomy starting in 2020, powered by DRIVE PX technology. Audi will deliver Level 3 autonomy in its A8 luxury sedan later this year through its zFAS system powered by NVIDIA. We also shared news at CES of our partnership with Mercedes-Benz to collaborate on a car that will be available by year’s end. During the quarter, Tesla began delivering a new autopilot system powered by the NVIDIA DRIVE PX 2 platform in every new Model S and Model X, to be followed by the Model 3.”

NVIDIA 3Q16 Earnings Call Notes

Jen-Hsun Huang

Gaming is no longer just about gaming

“And then the third is gaming is no longer just about gaming. Gaming is part sports – part gaming, part sports and part social. There are a lot of people who play games just so they can hang out with their other friends who are playing games. And so it’s a social phenomenon and then, of course, because games are – the quality of games, the complexity of games in some such as League of Legends, such as StarCraft, the real-time simulation, the real-time strategy component of it, the agility – the hand-eye coordination part of it, the incredible teamwork part of it is so great that it has become sport. And because there are so many people in gaming, because there is – it’s a fun thing to do and it’s hard to do, so it’s hard to master, and the size of the industry is large, it’s become a real sporting event.”

The only way we’re going to be able to discern IoT is through AI

“And then the third – so hyperscale, enterprise computing, and then the third is something very, very new, it’s called IoT. IoT – we’re going to have a trillion things. We’re going to have a trillion things connected to the Internet over time, and they’re going to be measuring things from vibration to sound to images to temperature to air pressure to you name it. Okay? And so these things are going to be all over the world, and we’re going to be constantly measuring and monitoring their activity. And the only thing that we can imagine that can help add value to that and find insight from that is really AI. Using deep learning, we could have these new types of computers. And they will likely be on premise or near the location of the cluster of things that you have, and monitor all of these devices and prevent them from failing, or add intelligence to them so that they add more value to what it is that people have them do.”

Tesla is 5 years ahead of the competition in self-driving

“Yeah, that’s – I don’t know that I have really granular breakdowns for you, Craig, partly because I’m just not sure. But I think the dynamics are that self-driving cars are probably the single most disruptive event – the most disruptive dynamic that’s happening in the automotive industry. It’s almost impossible for me to imagine that in five years’ time, a reasonably capable car will not have autonomous capability at some level, and a very significant level at that. And I think what Tesla has done by launching and having on the road in the very near future here a full autonomous driving capability using AI, that has sent a shock wave through the automotive industry. It’s basically five years ahead. Anybody who’s talking about 2021 – that’s just a non-starter anymore. And I think that that’s probably the most significant bit in the automotive industry. I just don’t – anybody who is talking about autonomous capabilities in 2020 and 2021 is at the moment re-evaluating in a very significant way.”

Colette Kress

Datacenter up 59%, driven by AI

“Cloud GPU computing has shown explosive growth. Amazon Web Services, Microsoft Azure and AliCloud are deploying NVIDIA GPUs for AI, data analytics and HPC. AWS has recently announced its new EC2 P2 instance, which scales up to 16 GPUs to accelerate a wide range of AI applications, including image and video recognition, unstructured data analytics and video transcoding.”

NVIDIA 2Q16 Earnings Call Notes

NVIDIA (NVDA) Jen-Hsun Huang on Q2 2017 Results

Deep learning is providing the majority of data center growth

“The vast majority of the growth comes from deep learning by far, and the reason for that is because high-performance computing is a relatively stable business. It’s still a growing business, and I expect high-performance computing to do quite well over the coming years. GRID is a fast-growing business.”

Experiencing growth in all of our businesses

“We’re experiencing growth in all of our businesses. Our strategy of focusing on deep learning, self-driving cars, gaming and virtual reality, these are markets where GPU makes a very significant difference, is really paying off.”

GPUs well suited for deep learning. We pushed all in to deep learning

“Deep learning, you may have heard, is a new computing approach. It’s a new computing model, and requires a new computing architecture. And this is where the parallel approach of GPUs is perfectly suited. And five years ago, we started to invest in deep learning quite substantially. And we made fundamental changes and enhancements for deep learning across our entire stack of technology, from the GPU architecture, to the GPU design, to the systems that GPUs connect into – for example, NVLink – to other system software that has been designed for it, like cuDNN and DIGITS, to all of the deep learning experts that we have now in our company. The last five years, we’ve quietly invested in deep learning because we believe that the future of deep learning is so impactful to the entire software industry, the entire computer industry that we, if you will, pushed it all in. And now we find ourselves at the epicenter of this very important dynamic, and it is probably – if there is one particular growth factor that is of great significance, it would be deep learning.”

GPUs helped make deep learning practical

“We’ve been in this industry since the very beginning, and deep learning was really ignited when pioneering researchers around the world discovered the use of GPUs to accelerate deep learning and made it practical, made it even practical to use deep learning as an approach for developing software. The GPU was a perfect match because the nature of the GPU is a sea of small processors, not one big processor, but a whole bunch of small processors.”
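
A minimal sketch of that “sea of small processors” idea, assuming Numba’s CUDA bindings and an NVIDIA GPU are available (the kernel and variable names here are invented for illustration, not taken from the call): rather than one big processor looping over a million elements, the work is split so that each of many lightweight GPU threads handles exactly one element.

```python
import numpy as np
from numba import cuda  # assumes numba and a CUDA-capable NVIDIA GPU

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)                 # this thread's global index
    if i < out.size:                 # guard the ragged final block
        out[i] = a * x[i] + y[i]     # each thread computes one element

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

d_x = cuda.to_device(x)              # copy inputs to the GPU
d_y = cuda.to_device(y)
d_out = cuda.device_array_like(x)    # allocate output on the GPU

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
saxpy[blocks, threads_per_block](np.float32(2.0), d_x, d_y, d_out)

out = d_out.copy_to_host()
print(np.allclose(out, 2.0 * x + y))  # sanity check on the host
```

Deep learning maps naturally onto this pattern because most of its arithmetic consists of large element-wise and matrix operations that can be spread across thousands of such threads at once.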

Deep learning has been infused into just about every internet service

“If you look at deep learning today, five years later, I think it’s a foregone conclusion that deep learning has been infused into just about every single Internet service to make them smarter, more intelligent, more delightful to consumers. And so you could see that the hyperscale adoption of deep learning is not only broad, it’s large-scale and it’s completely global.”

My sense is that our lead is quite substantial

“My sense is that our lead is quite substantial, and our position is very good. But we’re not sitting on our laurels, as you can tell, and for the last five years we’ve been investing quite significantly. And so over the next several years, I think you’re going to continue to see quite significant jumps from us as we continue to advance in this area.”

Workloads in the datacenter have really changed from just text

“The type of workload in the datacenter has really changed. Back in the good old days, it largely ran database searches, but that has changed so much. It’s no longer just about text, it’s no longer just about data. The vast majority of what’s going through the Internet and what’s going through datacenters today, as you guys know very well, is images and voice, and increasingly – probably one of the most important new data formats – live video.”

The architecture has to change with the workload

“And so if you think about datacenter traffic going forward, my sense is that the workload is going to continue to be increasingly high throughput, increasingly high multimedia content, and increasingly, of course, powered by AI and powered by deep learning. And so I think that’s number one. The second is that the architecture of the datacenter is recognizably, understandably changing because the workload is changing. Deep learning is a very different workload than the workload of the past. And so the architecture – it’s a new computing model, and it needs a new computing architecture – and accelerators like GPUs are really, well, a good fit.”

Computer vision is one tiny sliver of the autonomous driving problem

“Computer vision and detection, object detection, is just one tiny sliver of the entire autonomous driving problem. It’s just one tiny sliver. And we’ve always said that autonomous vehicles, self-driving cars, is really an AI computing problem. It’s a computing problem because the processor needs to do not just detection but also computation. The CPU matters, the GPU matters, the entire system architecture matters. And so the entire computation engine matters. Number two, computing is not just a chip problem, it’s largely a software problem.”

Deep learning is Thor’s hammer that fell from the sky

“Deep learning is really machine learning supercharged, and deep learning is really about discovering insight in big data, in big unstructured data, in multi-dimensional data. And that’s what deep learning is – I’ve called it Thor’s hammer that fell from the sky, and it’s amazing technology that these researchers discovered. And we were incredibly, incredibly well prepared because GPUs are naturally parallel, and that put us in a position to really be able to contribute to this new computing revolution. But when you think about it in the context that it’s just – it’s software development, it’s a new method of doing software and it’s a new way of discovering insight from data. What company wouldn’t need it?”

Automotive ASPs for self driving cars will be much higher than infotainment

“Automotive ASPs for self-driving cars will be much higher than infotainment. It’s a much tougher problem. Every car in the world has infotainment. With the exception of some pioneering work or early – the best, the most leading-edge cars today, almost no cars are self-driving. And so I think that the technology necessary for self-driving cars is much, much more complicated than lane keeping, or adaptive cruise control, or first-generation and second-generation ADAS. The problem is much, much more complicated.”

Colette M. Kress – Chief Financial Officer & Executive Vice President

24% revenue growth

“Revenue continued to accelerate, rising 24% to a record $1.43 billion. We saw strong sequential and year-on-year growth across our four platforms, Gaming, Professional Visualization, Datacenter and Automotive.”

Datacenter revenue reached $151m

“Moving to datacenter, revenue reached a record $151 million, more than doubling year-on-year and up 6% sequentially. This impressive performance reflects strong growth in supercomputing, hyperscale datacenters and GRID virtualization. Interest in deep learning is surging as industries increasingly seek to harness this revolutionary technology. Hyperscale companies remain fast adopters of deep learning, both for training and real-time inference, particularly for natural language processing, video and image analysis. Among them are Facebook, Microsoft, Amazon, Alibaba and Baidu. Major cloud providers are also offering GPU computing for their customers. Microsoft Azure is now using NVIDIA’s GPUs to provide computing and graphics virtualization.”