Morgan Stanley TMT Conference


Keith Weiss: Excellent. Thank you, everyone, for joining us. My name is Keith Weiss. I run the U.S. software research effort here at Morgan Stanley. And I'm really pleased to have with us from Microsoft, Scott Guthrie, Executive Vice President of Cloud and AI. I think there are a couple of other things behind that title; you have a wide scope of responsibility. But really a super interesting conversation ahead, given everything that's going on with Microsoft.

Before we get into that, something uninteresting, our disclosure statement. For important disclosures, please see the Morgan Stanley research disclosure website, at www.morganstanley.com/researchdisclosures. If you have any questions, please reach out to your Morgan Stanley sales representative.

Excellent. So, thank you so much for joining us. I think it's a fascinating time to be talking to you, given sort of everything that's going on within sort of the Microsoft ecosystem.

And I think we should just start with what everyone's super excited about. From an investor standpoint, and especially for people who have been paying close attention, the pace of innovation and what we're seeing in terms of AI and generative AI capabilities coming into the solution portfolio has really impressed people – and impressed them to the upside of expectations.

Could you talk to us about how this came together? Because this is something that didn't just come to the forefront over the last month or two. This is something you guys have been putting together for years. Could you talk to us about how you've built out the AI capability within Microsoft, and how we're now seeing it expressed across the product divisions?

Scott Guthrie: It's an exciting time. I think we sort of call it the Age of AI that we're entering, and it's probably going to be the most significant technology transformation in any of our lifetimes. And we've all experienced lots of big ones, but I do think over – and it's going to be over the next couple of years. So, that's not a statement around the next quarter or two. But I do think this is very, very profound and really going to change how business works, how society works, going forward.

And it's kind of been amazing on the technology side. I mean, this has been a bet that we've made going back many years now, in deep partnership with the OpenAI team. And I know Sam's going to be here at the event, I think later this week. It's been a great partnership, where we made some mutual bets on building what we call the AI supercomputer – a service inside Azure that is really optimized for these very, very large language model trainings. And we jointly did a whole bunch of architecture work, designing how they were going to build the models and how we were going to build the infrastructure, and really built something pretty special that allows these large language models to be trained very fast and iteratively.

And then kudos to the OpenAI team. They really pioneered a tremendous amount of kind of new ways of thinking about building these models. And the combination, I think, has really been magic the last six months. And I think the road ahead is going to be pretty exciting as we start to move from training these models, providing these models, to really embedding this now into every single app and experience.

And at Microsoft you've seen it even this week: yesterday we announced our Dynamics 365 Copilot and our Power Platform Copilot. We shipped GitHub Copilot last year, and you're going to see us infuse this AI deeply throughout all of our applications. And it's, I think, going to be great for customers, and really the next foundation of computing.

Keith Weiss: Right. So, if we think about it structurally within Microsoft, it's not just the OpenAI partnership. You guys have a lot of your own AI research that you do in-house. You acquired some interesting technology with Nuance and their DAX platform. From what I understand, there's a centralized AI core functionality, and then it's up to the product teams to figure out how to expose that through their own solutions. Is that the correct way to think about it?

Scott Guthrie: A little bit, yes. I mean, there's a core of what we call the AI platform that we're building, and it's the same platform we offer to our external customers and partners. And so, the nice thing is that what Office 365 or Nuance or Dynamics or GitHub are using is the same platform infrastructure and the same capabilities that any external partner or customer can leverage as well. And we believe that first-party and third-party symmetry is important. And so, there's a lot that we share.

And part of the opportunity with these large language models is the ability to have them know a lot of stuff about a lot of things and be usable in lots of different domains. And then what we've built with our Azure OpenAI service, as an example, is the ability for organizations or internal teams to provide fine-tunings on the model, specific to a use case, to make it even better. And that promise – being able to leverage the large language model, which is trained on the public web, and then the ability for, say, a Morgan Stanley or another customer to take proprietary data and tune it even further, knowing that that model is only going to be used by you, not by us, that it's not going to benefit anyone else, and that you control access and the business model around it – is what I think enterprise customers in particular are looking for, and what all of the SaaS ISVs and startups that serve those customers are going to need.
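
To make that concrete, here is a minimal sketch of what calling a privately fine-tuned deployment through the Azure OpenAI service could look like from Python, using the openai package's Azure mode as it worked at the time. The resource endpoint, deployment name, and prompt are illustrative assumptions, not details from the conversation:

```python
import openai

# Azure OpenAI uses your own resource endpoint and key, so requests and
# fine-tuned weights stay scoped to your tenant.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"  # hypothetical resource
openai.api_version = "2022-12-01"
openai.api_key = "..."  # key from your own Azure resource

# "my-tuned-model" is a placeholder for a deployment created from a
# fine-tuned base model; the tuning data never benefits other tenants.
response = openai.Completion.create(
    engine="my-tuned-model",
    prompt="Summarize this client portfolio note: ...",
    max_tokens=200,
)
print(response["choices"][0]["text"])
```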

And I think – I don't want to claim it's perfect, but I think we've got something special there. And the fact that we're able to use it now for our first-party products, and that we're able to offer it to all of our customers and partners, I think speaks to the opportunity in the years ahead.

Keith Weiss: Right. And that's important on both sides of the equation. If you're talking about large enterprises, particularly in regulated industries, you have the security and data residency side of the equation. But also, the AI, if you will, gets smarter about your particular business process. It's meant to solve or answer the questions specific to your business.

Scott Guthrie: I mean, AI learns from data. And so, one of the things you need to think about with AI is: as you provide data to that model and it learns, who owns that model, and who owns that data? And that's why trust is so important, I think, in this AI age. And again, our promise is your data is your data. It's not our data. You get to monetize it. You get to control it. And the foundation models won't learn from your data. Your instance learns, but not the models that are shared by others.

So, I think that promise – and frankly, the trust that we've earned over the years at Microsoft – matters. You earn trust in drips and you can lose it in buckets, so it's really important to focus on that trust. But I think that puts us in a good position, where people look at us versus maybe some of the other Big Tech companies and say, "Yes, I feel like I can trust Microsoft." And that hopefully makes us very good stewards in the years ahead, where people look at us as the trusted partner that's going to help them fully leverage this AI to its maximum potential.

Keith Weiss: Got it. One of the reasons I was so excited to talk to you at this point in time is that I think there are some really foundational questions that investors are asking about the underlying technology. And one of the biggest ones is how to think about competition – what's going to make one model better than another. So, let me ask you the question: how should we think about judging whether a GPT model from OpenAI is better than what Google is bringing to the equation, or what Amazon is bringing to the equation? What are going to be the parameters of that competition?

Scott Guthrie: I think there are going to be two aspects. I mean, one is going to be the raw technical capability of the model. And so, obviously, we're going to be very focused on making sure that the base model is super competitive from the get-go. And a year ago, only a small part of the world knew what large language models were. I think with ChatGPT, some of the things we've done with Bing, GitHub Copilot – suddenly the world has woken up to, wow, this is pretty amazing stuff. And so, we're going to continue to see large language model innovations in the years ahead.

But I think the other thing for differentiation, to your point, is going to be the signal from use cases – from people actually using it. If you look at GitHub Copilot, which was the first really widely used large language model service in the world, and you compare the accuracy of the code that was generated last July versus what it is today, it's dramatically better today. And it's because, as people use it, the model gets better and the accuracy gets better.

And so, sometimes being first to market, and that signal improvement, can really start to differentiate these models beyond even the base capabilities in them. And I think that's partly why you're seeing us move as quickly as we are – whether it's GitHub, whether it's Microsoft 365, whether it's Dynamics, whether it's Power Platform, Nuance, etc. There's literally nothing in our portfolio where we're not very aggressively looking to leverage AI, partly because we also want to get that signal going. And when you have hundreds of millions of commercial customers using your products, that's a lot of good signal that's going to improve them. And that, I think, is going to further differentiate our models versus others in the market, hopefully.

Keith Weiss: Got it. That's a good segue to the conversation on Bing and the new Edge browser. On one side of the equation, I think the Bing announcement and the capabilities impressed a lot of people. And really, it was a great marketing event for Microsoft, in bringing it to the entire world, if you will: these capabilities exist, and they exist within Microsoft.

There was very quickly some blowback about, well, some of the answers weren't right. Right? Sydney emerged over the weekend, and it kind of freaked some people out. But it sounds like that's part of the learning process – part of making these models better is that they have to get that usage up.

Scott Guthrie: The day we announced the new Bing, one of the things that we were very clear on was that this is going to be an evolution – we're going to learn and evolve, we won't get everything right, it's going to keep improving, and we're going to do it really fast. And that's been the approach the team has taken on the Bing side, and I think they've done a good job of reacting fast. In some cases, people are doing 200 prompts to try to cause the model to say something strange. But credit to the team, they're reacting fast. And we take it very seriously in terms of making sure the AI is responsible and safe. That's core to our DNA. And that's partly why it's not available to everyone yet. We start with a cohort of users, we learn, we improve. And we're going to make sure that we deliver this technology in a really safe, responsible way.

Keith Weiss: Got it. In terms of the monetization avenues on a go-forward basis, one of the competitive advantages Microsoft seems to have is all these avenues through which you can bring it out and productize it. So, Bing is one of them. But like you were saying, GitHub and all the developer platforms are another. Teams Premium.

There's one that we haven't heard from yet – Office, the overall productivity suite – but I guess that's to come at the March 16 event. Is it fair to say it's a similar kind of perspective? Because one of the interesting things about all these innovations you've been putting out is that they all have a price point behind them. This isn't just innovation folded into how the overall software moves forward – Teams Premium is a SKU. Should the expectation be that that's going to be the route forward across the entire portfolio?

Scott Guthrie: I guess the way to look at it would be: what is the productivity win you're giving to the business, whether it's around making an employee more productive or making a specific business process more effective? Take the example of, say, GitHub Copilot, since that's a product that's GA today. We're now seeing that developers using GitHub Copilot are 55% more productive on tasks with it, as measured by independent studies. And 40% of the code they're checking in is now AI-generated and unmodified.

And so, if I talk to a CIO or a CTO and say, "If I can give you 55% more developers overnight, would you take that?" – they're all looking for talent, they're all constrained in the number of engineers they can hire. They're gladly going to pay for that. And we have a good price point for it, I think, that is a no-brainer for them to pay. And it's a great service and a great business.
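
As rough arithmetic on that pitch – the team size below is hypothetical, and the 55% figure is the one cited above:

```python
# Back-of-the-envelope reading of "55% more developers overnight".
team_size = 100           # hypothetical engineering org
productivity_gain = 0.55  # task-speed improvement cited in the conversation

effective_capacity = team_size * (1 + productivity_gain)
print(f"{team_size} developers ~= {effective_capacity:.0f} developer-equivalents")
# -> 100 developers ~= 155 developer-equivalents
```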

And I think there's going to be lots of opportunities here. When you look at AI, it's going to be additive. We're adding new scenarios and taking cost out, enabling organizations to move faster and be more productive. And so, from a margin perspective or from a revenue perspective – to your point on pricing – I think these things are going to be additive to our overall business.

And in a lot of cases, some of these use cases cost an organization a lot of money today. Think of Nuance with healthcare and physician visits. Physicians only see so many patients a day. If you can help them see significantly more patients and have a better experience, that is worth a lot.

Similarly, take call centers and customer support. If you can deflect a case without even needing a human to answer it, you have happier customers. And again, you've taken a lot of cost out of the system.

And so, I think for each of these things there will be different ways we monetize. But in a world where we're delivering massive productivity wins, the good news is customers want to pay for it, because it ultimately takes their overall cost down. And I think there will be different opportunities for us to add additional value going forward.

Keith Weiss: Got it. Outstanding. So, today, we're seeing a lot of this functionality exposed through existing applications. On a go-forward basis, do you think this changes the paradigm of how people build applications? And could it potentially shift the pendulum we see between buy-versus-build more toward the build side of the equation?

Scott Guthrie: I think in the short run the fastest way to get some of this AI value is going to be through finished apps – again, like the Copilot experiences that we're launching – because people are already trained on the apps, and it just augments them, integrates with them, lets people move faster. And so, I think there's a huge opportunity there.

And then I think we're also going to see the next generation of apps, built on the raw APIs and the services around them, that are going to re-envision pretty much every experience that we see. I've told the story a few times about ecommerce. We all take for granted that you go in a web browser to an ecommerce site: there are categories on the left, you click. There are products, you click. You read the reviews, check the price point, see if it's in stock. You add it to your cart, and you check out. That was codified 30 years ago, and it hasn't changed dramatically.

I think we're going to be in a place probably two holiday seasons from now where instead of browsing, I'm probably going to have a text box and I'm going to say I want to buy a gift for a family member, here's the price point, and I want it delivered by December 19, this is what they like. And it's going to find the products for me. And that's going to be a very different user experience. It's going to be a very different question-and-answer experience. I'm going to be able to ask it questions about the product versus scroll through hundreds of comments and reviews.
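
A minimal sketch of what that text-box flow could look like under the hood – one natural-language request in, a structured catalog query out. The endpoint, deployment name, JSON schema, and catalog API here are all illustrative assumptions:

```python
import json
import openai

# Hypothetical Azure OpenAI setup, as in the earlier sketch.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"
openai.api_version = "2022-12-01"
openai.api_key = "..."

request = (
    "I want to buy a gift for my dad, around $75, delivered by "
    "December 19. He likes woodworking and coffee."
)

# Ask the model to turn the shopper's request into a structured query.
prompt = (
    "Extract a product search query from the shopper's request as JSON "
    'with keys "interests", "max_price_usd", and "deliver_by".\n'
    f"Request: {request}\nJSON:"
)

completion = openai.Completion.create(
    engine="my-deployment",  # hypothetical deployment name
    prompt=prompt,
    max_tokens=150,
    temperature=0,
)
query = json.loads(completion["choices"][0]["text"])
# query can now drive an ordinary catalog search, e.g.:
# results = catalog.search(**query)  # hypothetical catalog API
```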

And I think every organization needs to start thinking about, "Okay, how do I reinvent how I do retail, wealth management, manufacturing, routing, customer support?" Some of it is going to be build, where people build it themselves – and I think big brands are going to need to have more control – and a lot of it is going to be buying components and composing them together. Part of what we're trying to do with the Microsoft Cloud is both. And being able to point to the fact that, hey, how we built GitHub Copilot, or how we're building Teams Premium or Dynamics, uses the exact same APIs you can use, gives us an opportunity to talk credibly to other software vendors and other enterprises about how they can do the same thing we are. And I think there's a big opportunity.

Keith Weiss: Okay. One of the presumptions I made in an earlier question was that ChatGPT – the Bing announcement, more succinctly – was a great marketing event for Microsoft. Is that correct? And has that spurred more customer conversations for you guys? And maybe more broadly, where are we in terms of the customer conversation around these generative-AI models or AI, more broadly? Like, how far into this opportunity do you think we are?

Scott Guthrie: I think we're still in the early innings. I mean, the thing that's been great about ChatGPT, and also about Bing, is the fact that end users can just use them. The number of people I've talked to who maybe haven't tried all of the new products from all the tech companies, but seem to have tried those two, and said, "Hey, we're using it now. My children are using it for homework" – which they're not supposed to – "we're using it in a variety of use cases." I hear more and more interesting ones. It has made what had been a very technical concept – large foundational AI models, transformer-based learning, terms most of the world didn't know the meaning of 12 months ago – actually real. Hundreds of millions of people have heard of ChatGPT and Bing now and tried them.

And so, I think that is making it much more real, which gives us an opportunity to say, "Hey, let's show you how you can use this in customer support. Let's show you how you can use it for developer productivity. Let's show you how you can use it for sales productivity." It's a good conversation starter. And again, people are looking for solutions that integrate with the workflows they already have and help them accelerate even more.

Keith Weiss: Got it. One of the things that investors are struggling with a little bit here is that it seems like there's a massive opportunity ahead of Microsoft. It's something Satya has talked about – this is what expands IT as a percentage of GDP from 5% to 10% over the next 10 years, in the way he lays out that market opportunity. But in the near term, we're talking about cloud optimizations. In the near term, we're seeing Azure growth decelerate. Can you give us some perspective on what those cloud optimizations mean? What are customers doing? Are they changing their views on how they want to use cloud fundamentally? Or is this more of a short-term tactical impact that's just about getting in line with budgets?

Scott Guthrie: Well, I think cloud optimization has been a core part of the cloud journey for 10-plus years now. So, I don't think it's necessarily new, per se. In general, the typical pattern we've seen going back many years is: you either migrate a workload to the cloud or you build a new workload in the cloud. You then optimize it. And then you reinvest the savings. And you rinse and repeat for the next workload or the next use case.

And we like that optimization process. We have dedicated teams at Microsoft that help our customers with it, because we know it earns loyalty and it earns the confidence to move more. And at the end of the day, if we can help our clients and customers and partners get more out of their investments, we know they'll spend more with us and they'll invest more in digital technology, which increases the overall size of the pie.

So, in general, we like that. And the types of optimizations people do: sometimes when they're first moving, they have a large test environment, and they go, "Okay, can I do more tests in production? I can shrink that test environment a little bit. Can I take advantage of things like reserved instances, or the new cost-savings plans that we have in Azure – commit to a longer term and reduce my per-unit cost a little bit on an ongoing basis?" And then sometimes they right-size VMs or right-size databases.
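
For a flavor of what that right-sizing step looks like in practice, here is a minimal sketch using the azure-mgmt-compute Python SDK; the subscription, resource group, VM name, and target size are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Right-size a VM down to a smaller SKU (placeholders throughout).
credential = DefaultAzureCredential()
client = ComputeManagementClient(credential, subscription_id="<subscription-id>")

vm = client.virtual_machines.get("my-resource-group", "my-vm")
vm.hardware_profile.vm_size = "Standard_D2s_v3"  # step down from a larger size

# Applying the new size restarts the VM as part of the resize operation.
poller = client.virtual_machines.begin_create_or_update("my-resource-group", "my-vm", vm)
poller.result()
```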

And so, that's all natural. There's only so much optimization you can do until you're done. And so, while people are optimizing, it's not like they're going to optimize it down to zero. At some point, you get done, and then you go on to the next workload.

What's happened in the last six months or nine months has been, as the macro situation has been uncertain, people have been, I would say, optimizing even more. And they are sometimes holding on to those savings a little bit longer before they reinvest it.

But I haven't really heard from any customer about a long-term change in terms of cloud. And there's always more workloads, there's always more use cases. And as we've been talking about with AI, if you're not constantly reinventing yourself with digital technology, you're going to be under severe competitive pressure. And so, I don't worry so much about the long term, but it does lead, obviously, to short-term questions in terms of that optimization journey and what exactly the impact is. But again, longer term, I'm not hearing any changes.

Keith Weiss: Right. And that concept that optimizations can only go on for so long – is that what Satya is talking about when he says he thinks this optimization activity lasts for a year but is unlikely to go significantly beyond that?

Scott Guthrie: Well, I think we're trying to be a little careful about giving guidance on a specific quarter or on an annualized basis, and I'm not trying to say that, because the reality is we'll see. But I think at some point, when you optimize a specific workload, you can't optimize it anymore. And so, there is a finite amount on a per-workload basis. And that's why, again, we ultimately feel very good about the long term for cloud and don't see any kind of strategic shifts that our clients are making. It's more that it's going to have, at times, a dampening effect in the shorter term. But again, the more they optimize, the more value they get, and the more they generally want to invest with us over the long term. And I look at AI and other new use cases that are coming out and hear a lot of excitement – people saying, "Yes, I've got to be doing more of this and this and this." So, the reinvestment of those savings, for me, is not in doubt. Obviously, we're all wondering exactly the when, on a quarter-by-quarter basis. But again, I have confidence in the long term.

Keith Weiss: Got it. Got it. I want to ask a question about Azure gross margins, but it's kind of a roundabout question. When you guys did the Bing announcement, Satya pretty directly went after the gross margins of the key competitor there, Google, saying that this is going to be a lower-gross-margin business and we're willing to spend on that. And it didn't take long for investors to say, well, if it's lower gross margins for search, is this going to be lower gross margins for Azure and the other cloud businesses as they roll more of this AI functionality beneath those platforms? So, is that the case? Because this is more computationally intensive, because you have to bring in GPUs to do the training, is this going to be a compressive impact on overall cloud gross margins for Microsoft?

Scott Guthrie: I think, overall, the thing I would point to is the fact that these are new workloads, and that they really open up more top-line revenue. And again, for a lot of these use cases – take a developer. If you can make that developer 55% more productive, I've got to believe there's a lot of gross margin in there, because that ultimately translates into a real opportunity to transform how an organization gets productivity out of its employees.

And so, I do think we're going to see, depending on the use case for AI, different ways that we'll monetize and different margin structures. But ultimately, if we can keep growing productivity dramatically for businesses, that is going to be a long-term opportunity from a TAM and revenue perspective, and I think it's going to allow us to maintain good margins as we do it. And obviously, that will continue to evolve in the quarters and years ahead as people take maximum advantage of it.

I think the other thing on AI that's worth thinking about is that AI has gravity; meaning, you can't go faster than the speed of light. And so, if you've got an application or a database and you've got an AI model, the further they're apart, the slower the network path and the calls between them are. And for a lot of these use cases – take something like GitHub Copilot. It's not like we're making one AI inference. Like, we're doing it on every keystroke. And the further that is apart, the slower the experience.
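
Rough, illustrative arithmetic on why that distance matters at keystroke granularity – the round-trip times and keystroke count here are assumptions, not measurements:

```python
# Per-keystroke inference amplifies any network distance between app and model.
same_region_rtt_ms = 2     # app and model co-located (illustrative)
cross_region_rtt_ms = 80   # app and model far apart (illustrative)
keystrokes = 200           # roughly one short function

print(f"same region:  {same_region_rtt_ms * keystrokes / 1000:.1f} s of added wait")
print(f"cross region: {cross_region_rtt_ms * keystrokes / 1000:.1f} s of added wait")
# -> 0.4 s vs 16.0 s: the distant model is unusable for keystroke-level completion
```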

And as we look at AI opportunities with Azure specifically, there's both the direct Azure AI model opportunity and the fact that people are going to increasingly want to move their apps and their databases into Azure to be close to those large models. And that's also going to be an opportunity for us to sell not only more AI, but also more VMs, more storage, more databases, more everything. And we see that as a real opportunity, both with customers that we have today and with a lot of customers that we don't have today. Take this particular zip code – it has not been our strongest, because it's very much open source-based developers, which has not been Azure and Microsoft's historical strength. But this is creating a conversation where people are, like, "Look, we want to take advantage of these large language models. We want to talk about how we could use them." And we need to first show our value with the models, and show the capabilities of Azure. But I do think this is going to be a great door opener for a lot of customers that haven't really given us a hard look historically, because they already had a cloud. And the fact that the OpenAI models run exclusively on Azure I think ultimately is going to be a big differentiator for us.

Keith Weiss: Got it. Got it. So, through both pricing and volume, you can make up for the higher compute intensity needed in this type of process?

Scott Guthrie: And we're continuing to really optimize these models. I think a lot of people were surprised last week when OpenAI lowered their prices by a factor of 10. That's partly because they found a way to lower the cost of inferencing correspondingly. And they said, "Hey, we can get a lot more revenue and open up a lot more opportunities when it's more cost-effective." And that cost-optimized model didn't exist 30 days ago.
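
In concrete terms: the published OpenAI API prices at the time were $0.02 per 1K tokens for text-davinci-003 versus $0.002 per 1K tokens for the new gpt-3.5-turbo; the monthly volume below is a made-up example:

```python
# What a 10x inference price cut means for a hypothetical app's monthly bill.
old_price_per_1k = 0.02       # USD per 1K tokens (text-davinci-003)
new_price_per_1k = 0.002      # USD per 1K tokens (gpt-3.5-turbo)
monthly_tokens = 500_000_000  # hypothetical volume

print(f"before: ${monthly_tokens / 1000 * old_price_per_1k:,.0f}/month")  # $10,000
print(f"after:  ${monthly_tokens / 1000 * new_price_per_1k:,.0f}/month")  # $1,000
```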

So, we're still in the early innings, and there's still a lot of optimization and learning that we're doing. And it's going to be fairly dynamic, but I think it's going to be exciting. Because, just looking at some of the ChatGPT use cases that people have posted about, or the Bing use cases, people are already using it in fascinating ways that none of us, I think, would have thought of a year ago. And I think we're going to see far more use cases over the next year.

Keith Weiss: Okay. That optimization activity that you were talking about that is bringing down inference costs, I would assume that's really just kind of getting started right now. But it's a motion, it's a muscle memory that Microsoft has. I mean, you guys have been driving up cloud gross margins and Azure gross margins pretty materially over the past couple of years.

One of the questions I get from investors is: does the fact that you guys don't have an AI accelerator design of your own inhibit how far you can go on that optimization curve? Google has its own custom silicon with TPUs and Amazon has its own chip designs, but Microsoft doesn't. Is that any significant inhibitor in terms of how efficient you can get?

Scott Guthrie: Well, in general, we're looking at how we optimize everything – whether it's the silicon and GPUs, whether it's the network interconnects, whether it's data center designs, whether it's server hardware, whether it's fiber. And some of those things we're doing organically. Two of the acquisitions we've done in the last six months are Lumenisity, a hollow core fiber company, and Fungible, which does storage and I/O optimization with DPUs. These are very specific areas that maybe five years ago would not have made any sense for us to invest in. Now, at the scale we're operating at, it makes a lot of sense, and you're going to continue to see us innovate both organically and inorganically to optimize every layer of the stack.

And part of the reason why not just OpenAI, but also the other large language model startups out there, are using Azure is that we have some pretty differentiated hardware with our AI supercomputer. And that, again, includes silicon, network, power, data center design, and a whole bunch of other elements. And you're going to continue to see us innovate on that.

So, we're going to be looking for opportunities at every layer of the stack to do optimizations. And I think the partnership we have with OpenAI, and the fact that we're building all of our own apps to deeply take advantage of these models, is also giving us real signal on which optimizations are really going to matter and how we continue to be on the bleeding edge of them. That signal has helped a lot in terms of what we've built. We wouldn't have built it the way we built it without the partnership we had and without some of these early applications. And I think that ultimately is going to help give us some good differentiation, both at the platform layer and at the app layer, in the years ahead.

Keith Weiss: Outstanding. Unfortunately, that takes us to the end of our allotted time. I could have this conversation all day long. But thank you so much, Scott, for joining us. This has been a great conversation.

Scott Guthrie: Thanks for having me. Thanks, everyone.