Advanced Micro Devices, Inc. (AMD) Q3 2023 Earnings Call Transcript

October 31, 2023

Advanced Micro Devices, Inc. (NASDAQ:AMD) Q3 2023 Earnings Conference Call October 31, 2023 5:00 PM ET

Company Participants

Mitch Haws - Head of IR
Lisa Su - Chair & CEO
Jean Hu - EVP, CFO, & Treasurer

Conference Call Participants

Toshiya Hari - Goldman Sachs
Aaron Rakers - Wells Fargo
Joseph Moore - Morgan Stanley
Timothy Arcuri - UBS
Vivek Arya - Bank of America Securities
Blayne Curtis - Barclays
Matthew Ramsay - TD Cowen
Ross Seymore - Deutsche Bank
Harsh Kumar - Piper Sandler
Stacy Rasgon - Bernstein Research
Christopher Rolland - Susquehanna

Operator

Greetings, and welcome to the AMD Third Quarter 2023 Earnings Conference Call. At this time, all participants are in a listen-only mode. A brief question-and-answer session will follow the formal presentation. [Operator Instructions] As a reminder, this conference is being recorded.

And it is now my pleasure to introduce to you Mitch Haws, Vice President, Investor Relations. Thank you, Mitch. You may begin.

Mitch Haws

Thank you, John, and welcome to AMD's third quarter 2023 financial results conference call. By now, you should have had the opportunity to review a copy of our earnings press release and the accompanying slides. If you have not had the chance to review these materials, they can be found on the Investor Relations page of amd.com.

We will refer primarily to non-GAAP financial measures during today's call, and the full non-GAAP to GAAP reconciliations are available in today's press release and slides posted on our website. Participants on today's conference call are Dr. Lisa Su, our Chair and Chief Executive Officer; and Jean Hu, our Executive Vice President, Chief Financial Officer and Treasurer. This is a live call and will be replayed via webcast on our website.

Before we begin, I would like to note that Forrest Norrod, Executive Vice President and General Manager, Data Center Business Solutions unit, will attend the UBS Technology Conference on Tuesday, November 28. AMD will host its Advancing AI Event on December 6, when AMD and its key ecosystem partners and customers will showcase the AMD products and partnerships that will shape the advancement of AI. The event will

Jean Hu, Executive Vice President, Chief Financial Officer and Treasurer, will attend the Barclays Global Technology Conference on Thursday, December 7. And our fourth quarter 2023 quiet time is expected to begin at the close of business on Friday, December 15.

Today's discussion contains forward-looking statements based on current beliefs, assumptions and expectations, speak only as of today and as such, involve risks and uncertainties that could cause actual results to differ materially from our current expectations. Please refer to the cautionary statement in our press release for more information on factors that could cause actual results to differ materially.

With that, I will hand the call over to Lisa. Lisa?

Lisa Su

Thank you, Mitch, and good afternoon to all those listening in today. We executed well in the third quarter, delivering strong top line and bottom line growth, achieving multiple milestones on our AI hardware and software road maps and significantly accelerating our momentum with customers for our AI solutions.

In PCs, there are now more than 50 notebook designs powered by Ryzen AI in market, and we are working closely with Microsoft on the next generation of Windows that will take advantage of our on-chip AI Engine to enable the biggest advances in the Windows user experience in more than 20 years.

In the Data Center, multiple large hyperscale customers committed to deploy Instinct MI300 accelerators, supported by our latest ROCm software suite and the growing adoption of an open hardware-agnostic software ecosystem. Looking at the third quarter financial results, revenue grew 4% year-over-year and 8% sequentially to $5.8 billion, driven by record server CPU revenue and strong Ryzen processor sales.

Turning to the segment results. Data Center segment revenue of $1.6 billion was flat year-over-year and up 21% sequentially as solid demand for both 3rd and 4th Gen EPYC processor families resulted in record quarterly server processor revenue. We gained server CPU revenue share in the quarter as 4th Gen EPYC CPU revenue grew more than 50% sequentially, crossing over to represent a majority of our server processor revenue and unit shipments.

In cloud, while the demand environment remained mixed in the quarter, EPYC CPU revenue grew by a strong double-digit percentage sequentially as hyperscalers expanded deployments of EPYC processors to power their internal workloads and public instances while optimizing their infrastructure spend. Nearly 100 new AMD-powered cloud instances launched in the quarter from Amazon, Google, Microsoft, Oracle, Tencent and others, including multiple new instances that deliver leadership performance for general purpose, HPC, bare metal and memory optimized workloads.

In enterprise, while overall demand remains soft, we are seeing strong indications that the significant performance and TCO advantages of Genoa and our expanded go-to-market investments are paying off as enterprise revenue grew by a double-digit percentage sequentially. We closed multiple new wins with leading automotive, aerospace, financial services, pharmaceutical and technology customers, and the number of enterprise customers actively testing EPYC platforms on-prem increased significantly quarter-on-quarter.

We also expanded our 4th Gen EPYC processor portfolio with the launch of our Siena processors that deliver leadership energy efficiency and performance for intelligent edge and telco applications. Dell, Lenovo, Super Micro and others launched new platforms that expand our EPYC CPU TAM to address telco, retail and manufacturing applications. With the launch of Siena, we now offer the industry's most performant and most energy-efficient portfolio of server processors across cloud, enterprise, technical, HPC and edge computing.

I am very pleased with the momentum we have built for our EPYC CPU portfolio. We are building on this momentum with our next-gen Turin server processors based on our new Zen 5 core that delivers significant performance and efficiency gains. Turin is in the labs of our top customers and partners now, and customer feedback has been very strong and we're on track to launch in 2024.

Looking at our broader Data Center portfolio, we made significant progress in our Data Center GPU business in the third quarter as the multiyear investments we have made in our hardware and software road maps resulted in significant customer traction for our next-generation Instinct MI300 accelerators and particularly our Instinct MI300X GPU that delivers leadership inferencing and training performance.

On the hardware side, bring-up and validation of our MI300A and MI300X accelerators continue progressing to plan with performance now meeting or exceeding our expectations. Production shipments of Instinct MI300A APUs started earlier this month to support the El Capitan Exascale supercomputer, and we are on track to begin production shipments of Instinct MI300X GPU accelerators to lead cloud and OEM customers in the coming weeks.

On the software side, we further expanded our AI software ecosystem and made great progress enhancing the performance and features of our ROCm software in the quarter. In addition to ROCm being fully integrated into the mainline PyTorch and TensorFlow ecosystems, Hugging Face models are now regularly updated and validated to run on Instinct accelerators and other supported AMD AI hardware.

AI start-up Lamini announced they achieved software parity with CUDA for LLMs running on Instinct MI250 GPUs, enabling enterprise customers to easily deploy production-ready LLMs fine-tuned for their specific data on Instinct MI250 GPUs with minimal code changes. We also strengthened our AI software capabilities with the strategic acquisitions of Mipsology and Nod.ai. Mipsology is a long-standing partner with proven expertise delivering AI software and solutions running on top of our adaptive SoCs for data center, edge and embedded markets.

Nod.ai adds a highly experienced team with a track record of substantial contributions to open-source AI compilers and industry-leading software already used by many of the largest cloud, enterprise and AI companies. Nod's compiler-based automation software can significantly accelerate the deployment of highly performant AI models optimized for our Instinct, Ryzen, EPYC, Versal and Radeon processors.

Based on the rapid progress we are making with our AI road map execution and purchase commitments from cloud customers, we now expect Data Center GPU revenue to be approximately $400 million in the fourth quarter and exceed $2 billion in 2024 as revenue ramps throughout the year. This growth would make MI300 the fastest product to ramp to $1 billion in sales in AMD history. I look forward to sharing more details on our progress at our December AI event.

Turning to our Client segment. Revenue increased 42% year-over-year and 46% sequentially to $1.5 billion. Sales of our Ryzen 7000 processors, featuring our industry-leading Ryzen AI on-chip accelerator, grew significantly in the quarter as inventory levels in the PC market normalized and demand began returning to seasonal patterns.

Revenue for our latest-generation client CPUs powered by our Zen 4 core more than doubled sequentially as we saw strong demand for our Ryzen 7000 Series notebook and desktop processors that deliver both leadership energy efficiency and performance across a wide range of workloads.

In commercial, we launched our first Threadripper Pro workstation CPUs based on our Zen 4 Core that deliver unmatched performance for multi-threaded professional design, rendering and simulation applications. Dell, HPE and Lenovo announced an expanded set of workstations powered by new Threadripper Pro processors as we focus on growing this margin-accretive portion of our client business.

Looking forward, we are executing on a multiyear Ryzen AI road map to deliver leadership compute capabilities built on top of Microsoft's Windows software ecosystem to enable the new generation of AI PCs that will fundamentally redefine the computing experience over the coming years.

Now turning to our Gaming segment. Revenue declined 8% year-over-year and 5% sequentially to $1.5 billion as lower semi-custom revenue was partially offset by increased sales of Radeon GPUs. Although semi-custom SoC sales declined in line with our projections for this point in the console cycle, overall revenue for this console generation continues tracking significantly higher than the prior generation based on strong demand for Microsoft and Sony consoles.

In Gaming Graphics, revenue grew both year-over-year and sequentially, driven by increased demand in the channel. We expanded our Radeon 7000 series with the launch of new RX 7000 Series enthusiast desktop GPUs that offer leadership price performance for 1440p gamers.

Turning to our Embedded segment. As we expected, revenue decreased 5% year-over-year to $1.2 billion. Sequentially, revenue declined 15% as lead times normalized and customers focused on reducing inventory levels. We expanded our leadership Versal SoC portfolio in the quarter with the launches of our first adaptive SoCs with on-chip HBM memory that deliver significant performance and efficiency for memory-bound data center, network, test and aerospace applications.

We also announced our next-generation space-grade Versal SoC that integrates an enhanced AI Engine and is the industry's only solution that supports unlimited reprogramming during development and after deployment. For the fintech market, we launched our latest Alveo Accelerator card that delivers a 7x improvement in latency compared to our prior generation and has already been deployed by multiple trading firms in their ultra-low latency trading platforms.

Since closing our acquisition of Xilinx a little over 1.5 years ago, our Embedded business has grown significantly, driven by our leadership products. Looking ahead, based on our current visibility, we expect Embedded segment revenue to decline sequentially as customers continue working through elevated inventory levels through the first half of 2024. Over the medium term, we see strong growth opportunities for our Embedded business based on our significant design win traction and our broad and differentiated portfolio of embedded FPGAs, CPUs, GPUs and adaptive SoCs that can address a larger portion of our customers' compute needs.

In summary, I'm pleased with our third quarter financial results, driven by the significant acceleration of Zen 4 server and client processor sales. Looking at the next couple of quarters, we expect strong growth in our Data Center business, driven by both EPYC and Instinct processors. This growth will be partially offset by softening demand in our Embedded business and lower semi-custom revenue, given where we are in the console cycle.

As the PC market returns to seasonal patterns, we believe we are well positioned to gain profitable share in the premium and commercial portions of the market based on the strength of our product offerings. We are focused on accelerating our leadership AI capabilities across our entire product portfolio, executing on our hardware and software road maps and expanding our enterprise computing footprint. I look forward to sharing more details on our AI progress in a few weeks at our Advancing AI event.

Now I'd like to turn the call over to Jean to provide additional color on our third quarter results and our outlook for Q4. Jean?

Jean Hu

Thank you, Lisa, and good afternoon, everyone. I'll start with a review of our financial results for the third quarter and then provide our current outlook for the fourth quarter of fiscal 2023. We delivered better than expected third quarter results with revenue of $5.8 billion and diluted earnings per share of $0.70. On a year-over-year basis, revenue increased 4% as growth in the Client segment revenue was partially offset by lower Gaming and Embedded segment revenue. Revenue was up 8% sequentially, driven by growth in both the Client and the Data Center segment.

Gross margin was 51%, up approximately 1 percentage point year-over-year primarily driven by stronger Client segment revenue and product mix. Operating expenses were $1.7 billion, an increase of 12% year-over-year, primarily driven by higher R&D investment to support our significant AI growth opportunity. Operating income was $1.3 billion, representing a 22% operating margin. Taxes, interest expense and other was $141 million. For the third quarter, diluted earnings per share was $0.70 compared to $0.67 in the same period last year.

Now turning to our reportable segments. Starting with the Data Center segment, revenue was $1.6 billion, flat year-over-year as growth in EPYC processor sales was offset by a decline in adaptive SoC product sales. Data Center revenue grew 21% sequentially, primarily driven by strong sales of our 4th Gen EPYC processors to both cloud and enterprise customers. Data Center segment operating income was $306 million or 19% of revenue compared to $505 million or 31% a year ago. Lower operating income was primarily due to increased R&D investment to support future AI revenue growth and product mix.

Client segment revenue was $1.5 billion, up 42% year-over-year, primarily driven by higher sales of Ryzen mobile processors. On a sequential basis, revenue grew 46% as PC market conditions continued to improve, and we ramped our Ryzen 7000 series to meet strong demand. Client segment operating income was $140 million or 10% of revenue compared to an operating loss of $26 million a year ago, driven by higher revenue and disciplined OpEx management. We are pleased that the Client segment returned to profitability in the third quarter.

Gaming segment revenue was $1.5 billion, down 8% year-over-year, primarily due to a decrease in semi-custom revenue, partially offset by an increase in Radeon GPU sales. On a sequential basis, Gaming segment revenue declined 5%, in line with our expectations as we are now in the fourth year of the console cycle. Gaming segment operating income was $208 million or 14% of revenue compared to $142 million or 9% a year ago, primarily driven by higher Radeon GPU revenue.

Embedded segment revenue was $1.2 billion, down 5% year-over-year, primarily due to lower sales to the communication market. On a sequential basis, Embedded segment revenue declined 15%, primarily due to inventory correction at customers in several end markets. Embedded segment operating income was $612 million or 49% of revenue compared to $635 million or 49% a year ago.

Turning to the balance sheet and the cash flow. During the quarter, we generated $421 million in cash from operations and free cash flow was $297 million. In the fourth quarter, we expect to pay approximately $550 million in cash taxes, primarily consisting of taxes previously deferred under the disaster-related tax relief the IRS made available to California filers.

Inventory decreased sequentially by $122 million to $4.4 billion. At the end of the quarter, cash, cash equivalents and short-term investment was strong at $5.8 billion. We returned $511 million to shareholders, repurchasing 4.8 million shares and we have $5.8 billion in remaining share repurchase authorization.

Now turning to our fourth quarter 2023 outlook. We expect revenue to be approximately $6.1 billion, plus or minus $300 million, an increase of approximately 9% year-over-year and 5% sequentially. Year-over-year, we expect revenue for the Data Center and Client segments to be up by a strong double-digit percentage, the Gaming segment to decline, given where we are in the console cycle, and the Embedded segment to decline due to additional softening of demand in the embedded market.

Sequentially, we expect the Data Center segment to grow by a strong double-digit percentage, Client segment revenue to increase, and the Gaming and Embedded segments to decline by a double-digit percentage. We expect non-GAAP gross margin to be approximately 51.5%, non-GAAP operating expenses to be approximately $1.74 billion, non-GAAP effective tax rate to be 13% and the diluted share count is expected to be approximately 1.63 billion shares.

In closing, I’m pleased with our execution in the third quarter with year-over-year growth in revenue, gross margin and earnings per share. In the fourth quarter, we expect to benefit from strong Data Center and Client momentum, driven by the MI300 AI accelerator ramp and the strength of our high-performance leadership Zen 4 family of products, despite lower sales in the Gaming segment and additional softening of demand in the embedded market.

Looking ahead, the investments we are making in AI across our Data Center, Client, Gaming and Embedded segments enable us to offer one of the industry’s broadest portfolios, targeting the most compelling opportunities and positioning us to drive long-term profitable growth.

With that, I’ll turn it back to Mitch for the Q&A session.

Mitch Haws

Thank you, Jean. John, we're happy to poll the audience for questions.

Question-and-Answer Session

Operator

Thank you, Mitch. We will now be conducting a question-and-answer session. [Operator Instructions] And the first question comes from the line of Toshiya Hari with Goldman Sachs. Please proceed with your question.

Toshiya Hari

Great. Thank you so much. Lisa, I had two questions. My first one is on the Data Center GPU business. You talked about '24 revenue potentially exceeding $2 billion. I was hoping you could provide a little bit more color. What percentage of this is AI versus supercomputing or other applications? Within AI, maybe talk about the breadth of your customer lineup. And how should we think about which workloads you're addressing, again, within the context of AI? Is it primarily training or inference or both?

Lisa Su

Great. Thanks, Toshiya, for the question. So look, we've made significant progress on the overall MI300 program. I think we're very happy with how the technical milestones look. And then also, we've made significant progress from a customer side. Your question as to how the revenue evolves, so the way to think about it is, in the fourth quarter, we said revenue would be approximately $400 million, and that's mostly HPC with some -- the start of our AI ramp.

And then as we go into the first quarter, we actually expect revenue to be approximately similar in that $400 million range. And that will be mostly AI, with a very small piece being HPC. And as we go through 2024, we would expect revenue to continue to ramp quarterly, and again, it will be mostly AI. Within the AI space, we've had very good customer engagement across the board from hyperscalers to OEMs, enterprise customers and some of the new AI start-ups that are out there.

From a workload standpoint, we would expect MI300 to be on both training and inference workloads. We're very pleased with the inference performance on MI300, so especially for large language model inference, given some of our memory bandwidth and memory capacity. We think that's going to be a significant workload for us. But I think we would see a broad set of workloads as well as broad customer adoption.

Toshiya Hari

Thank you. And then as my follow-up, a question on the server CPU side. You talked about Genoa growing really nicely in the quarter. I think you talked about both units and revenue being bigger than its predecessor. Is the growth that you're seeing or the growth that you saw in Q3 and the growth that you're guiding to for Q4, is this primarily a function of share growth or are you actually seeing a pickup in the overall market?

And I ask the question because, obviously, year-to-date, there's been a significant shift away from traditional compute to accelerated computing, but are you actually starting to see signs of stabilization or even improvement on the traditional compute side? Thank you.

Lisa Su

Sure. So the way I would frame it is we're very pleased with our third quarter performance as it relates to EPYC overall. I think the 4th Gen EPYC, so that's Genoa plus Bergamo, actually ramped very nicely. We got to a crossover in the third quarter, which is a little bit ahead of what we had previously forecasted. And when I look underneath that, I would say we saw strong growth in both cloud and enterprise. Cloud was strong, up strong double digits. The adoption is pretty broad across first-party and third-party workloads and new instances.

And then on the enterprise side, we've also seen some nice growth across our OEMs. And so from the standpoint of, is it the market recovery or is it share gain? I think it's some of both. From a market standpoint, I would say it’s still mixed. I think enterprise is still a little bit mixed depending on sort of which region from a macroeconomic standpoint. Cloud depends a bit on the customer set. But overall, I think we’re pleased with the progress, and the leadership of EPYC has ended up allowing us to grow substantially in the third quarter and then into the fourth quarter.

Operator

And the next question comes from the line of Aaron Rakers with Wells Fargo. Please proceed with your question.

Aaron Rakers

Yeah. Thank you for taking the questions. Just to build off that last question, Jean, I think last quarter, you kind of endorsed the notion that your Data Center business would grow. I think it was in the high-single digit range. I think you started the year thinking like 10. So I guess the question is, do you still see that kind of growth rate setup? And how has that $400 million evolved underneath that? Has that -- was it $300 million now going to $400 million? Just how has that changed over the course of the last quarter just to level set that Data Center expectation?

Jean Hu

Yeah. So I think for the second half, we said we expect the Data Center business to grow approximately 50% versus the first half. But right now, based on what we are seeing, we continue to see that similar range of about 50%. So we are very happy and pleased about the strong momentum of our Data Center business. On the GPU side, Lisa mentioned about $400 million, around $400 million. As we went through the quarter, we had strong engagement with customers. So we do see the progress continuing, and we see customers placing orders. That's why, as we went through the quarter, we became increasingly confident about the revenue profile we are guiding for Q4.

Lisa Su

Yeah. And Aaron, if I could add to that -- if I can just add to that. I think what we've seen is the adoption rate of our AI solutions has given us confidence in not just the Q4 revenue number but also sort of the progression as we go through 2024.

Aaron Rakers

Yeah. That's helpful. And maybe just the follow-up, how would you characterize the supply side of the equation? As you look at that $2 billion number, do you feel confident that you've got adequate visibility in the supply side to hit those expectations, any update on that side?

Lisa Su

Sure, Aaron. So we've been planning the supply chain for the last year and we're always planning for success. So certainly, for the current forecast of greater than $2 billion, we have adequate supply. But we have also planned for a supply chain forecast that could be significantly higher than that, and we would continue to work with customers to build that out.

Operator

And the next question comes from the line of Joe Moore with Morgan Stanley. Please proceed with your question.

Joseph Moore

Great. Thank you. Following up on the Data Center GPU, can you talk about the breadth of customers that you might see there? How -- I assume it's fairly concentrated in year one, but you also did mention multiple hyperscalers. Can you just give us a sense for how concentrated that might be?

Lisa Su

Yeah. Sure, Joe. So we've been engaging broadly with the customer set. I think in the last earnings call, we said that our engagements had increased 7 times and so there is a lot of interest in MI300. We will start, let's call it, more concentrated in cloud, sort of several large hyperscalers. But we're also very engaged across the enterprise and there's a lot of interest. Our partnerships with the OEMs are quite strong. And when we think about sort of the breadth of customers who are looking for AI solutions, we certainly see an opportunity, especially as we get beyond the initial ramp to broaden the customer set.

Joseph Moore

Great. And now that you're getting a look at volume in that space, can you talk about, are the gross margins there going to be comparable to your other Data Center businesses?

Jean Hu

Yeah. So on the gross margin side, we do expect our GPU gross margin to be accretive to corporate average. Of course, right now, we are at a very, very early beginning of the ramp of the product. As you probably know, typically when you ramp new product, it takes some time to improve yield, testing time, manufacturing efficiency. So typically, it takes a few quarters to ramp the gross margin to a normalized level. But we are quite confident that our team is executing really well.

Operator

And the next question comes from the line of Timothy Arcuri with UBS. Please proceed with your question.

Timothy Arcuri

Lisa, I also wanted to ask about that $2 billion number for Data Center GPU next year. That's still a pretty small portion, obviously, of the total TAM. Where do you think that can go? Do you think when we look at this out a couple of years, do you think you can be 15%, 20% share for total Data Center GPU or do you have aspirations to be even larger than that?

Lisa Su

Yeah, Tim. I mean, I would say that, first of all, this is an incredibly exciting market, right? I think we all see the growth in generative AI workloads. And the fact is, we're just at the very early innings of people truly adopting it for enterprise, business productivity applications. So I think we are big believers in the strength of the market. We previously said we believe that the compound annual growth rate could be 50% over the next three or four years. And so we think the market is huge and there will be multiple winners in this market.

Certainly, from our standpoint, we want to be -- we're playing to win and we think MI300 is a great product, but we also have a strong road map beyond that for the next couple of generations. And we would like to be a significant player in this market so we'll see how things play out. But overall, I would say that I am encouraged with the progress that we're making on hardware and software and certainly with the customer set.

Timothy Arcuri

Thanks a lot. And then Jean, I just wanted to ask on March. I know that there's a lot of moving parts. It sounds like Data Center is up but PC is going to be down, normal seasonal, and Embedded and Gaming sound down as well. So can you just help us shape sort of how to think about March? Is it down a smidge? Is it flat? Could it be up a little bit? And maybe then how to think about like first half, back half next year, if you even want to go there. Thanks.

Jean Hu

Hey, Tim. We're guiding one quarter at a time. But just to help you with some of the color, as Lisa mentioned earlier, we said the Data Center GPU revenue will be flattish sequentially. That's the first thing, right? The mix will shift from El Capitan majority in Q4 to predominantly more for AI in Q1. So that -- because of the long lead time manufacturing cycle, we feel like it's going to be a similar level of revenue with the Data Center GPU.

But in general, if you look at our business, we do have a seasonality. Typically in Q1, the Client business, server business and Gaming business are seasonally down. Of course, right now, we definitely have a little bit more seasonality, given the Embedded and Gaming dynamics we are seeing right now. Server and Client typically are down sequentially, seasonally, too. But overall, I think we are really focused on just execution. We probably can provide more color when we get close to Q1 2024, and especially, Lisa, please add if we have any color we can provide on the whole year 2024.

Lisa Su

Yeah. No, I think that covers it. When we look at the various pluses and minuses, I think we feel very good about the Data Center business. It continues to be a strong growth driver for us as we think about 2024, for both server as well as our MI300. Client, as well, we think incrementally improves from a market standpoint, and we believe we can gain share, given the strength of our product portfolio. And then we have the headwinds of Embedded and the inventory correction that we'll go through in the first half, and the console cycle. So I think those are the puts and takes.

Operator

And our next question comes from the line of Vivek Arya with Bank of America. Please proceed with your question.

Vivek Arya

Thanks for taking my question. Lisa, on the MI300, many of your hyperscaler customers have internal ASIC solutions ready or in the process of getting them ready. So if inference is the primary workload for MI300, do you think it is exposed to replacement by internal ASICs over time or do you think both MI300 and ASICs can coexist, right, along with the incumbent GPU solution?

Lisa Su

Yeah. I think Vivek, when we look at the set of AI workloads going forward, we actually think they're pretty diverse. I mean, you have sort of the large language model training and inference then you have what you might do in terms of fine-tuning off of a foundational model and then you have, let's call it, straight inferencing what you might do there.

So I think within that framework, we absolutely believe that MI300 has a strong place in the market, and that's what our customers are telling us and we're working very closely with them. So yes, I think there will be other solutions, but I think for the -- particularly for the LLMs, I think GPUs are going to be the processing of choice and MI300's very, very capable.

Vivek Arya

Got it. And then a question, Lisa, on just this interplay between AI and traditional computing. It seems like, especially when it relates to ASPs and units, server CPU makers are kind of holding the line on price per core. But at the same time, the cloud players are extending the depreciation and replacement cycle of traditional server CPUs. So I'm just curious to get your take. What do you think is the interplay between units and ASP, right?

If you were to take a snapshot of what you have seen in '23 and how it kind of informs you as you look at '24, that is, is it possible that maybe unit growth in server is not that high but you are able to make up for it on the ASP side? So just give us some color on, one, what is happening to traditional computing deployments? And secondly, is there a difference in kind of the unit and ASP interplay on the server CPU side?

Lisa Su

Yeah. I think it's a good point, Vivek. So I mean, if I take a look at 2023, I think it's been a mixed environment, right? There was a good amount of, let's call it, caution in the overall server market. There was a bit of inventory digestion at some of the cloud guys and then some optimization going on with enterprise, again, somewhat mixed. I think as we go forward, we return to growth in the server CPU market.

Within that realm, because these -- like for example, 4th Gen EPYC, somewhere between 96 and 128 cores. I mean, you just get a lot of compute for that. So I do think there is the framework that unit growth may be more modest, but ASP growth, given the core count and the compute capability will contribute to overall growth. So from a traditional server CPU standpoint, I think we do see those trends. 2023 was a mixed environment and I think it improves as we go into 2024.

Operator

And the next question comes from the line of Blayne Curtis with Barclays. Please proceed with your question.

Blayne Curtis

Thanks for taking the question. I want to ask on the Embedded side. I think last quarter, you kind of talked about the headwinds being mostly in the communications end market. You're guiding it down in December. I was curious if that weakness has spread. And then your competitor talked about kind of a reset getting back to pre-pandemic levels. Just kind of curious how you framed that reset? You said it'd be weak through the first half.

Lisa Su

Yeah. Absolutely, Blayne. So I think when we look at end markets, I think communications was weak in sort of last quarter and it certainly continues to be weak. We see 5G sort of CapEx just down overall. The other market where we see a little bit of, let's call it, soft end-market demand would be industrial, and that's a little bit more geographic, so a little bit worse in Europe than in other geographies.

The other end markets are actually relatively good. And what we just see is that inventory is high, just given where we were with lead times coming through the pandemic and the high demand that was out there. As the lead times have normalized, people are drawing down their inventories, and they have an opportunity to do that, given the normalization. So from an overall standpoint, we think demand is solid. And what we view is that we have a very strong portfolio in Embedded.

We like sort of the combination of the, let's call it, the classic Xilinx portfolio together with the Embedded processing capabilities that we add. Customers have seen sort of that portfolio come together, and we've gotten some nice design win traction as a result of that. So we have to get through sort of the next couple of quarters of inventory correction, and then we believe we'll return to growth in the second half of the year.

Blayne Curtis

Thanks. And then, I just wanted to ask on the PC market. I think you and Intel have seen -- you were under-shipping in the first half. Maybe you're kind of over-shipping a little bit now, restocking. I'm just kind of curious to your perspective of what that normalized run rate is in terms of the size of the PC market and kind of any perspective if inventory levels are starting to move back up.

Lisa Su

Yeah. I would say, again, Blayne, when we looked at sort of the third quarter and sort of the environment that we’re in now, I think inventory levels are relatively normalized, and so the selling and consumption are fairly close. We were building up for a holiday season that is a strong season for us overall.

When I think about the size of the market, I think from a consumption standpoint this year is probably somewhere like 250 million to 255 million units or so. We expect some growth going into 2024 as we think about sort of the AI PC cycle and some of the Windows refresh cycles that are out there. And I think the PC market returns to, let’s call it, a typical seasonality, and underneath that, we have a strong product portfolio. And we are very much focused on growing in places like high-end gaming, ultrathins, premium consumer as well as commercial. So those are -- that’s how sort of we see the PC market.

Operator

And the next question comes from the line of Matt Ramsay with TD Cowen. Please proceed with your question.

Matthew Ramsay

Thank you very much. Good afternoon. Lisa, I wanted to maybe ask the AI question a little bit differently, not just focused on your GPU portfolio, but more broadly. I think one of the big surprises to a lot of us is, how quickly the AI market changed from accelerator cards to selling full servers or full systems for your primary competitor. And they've done a lot of innovation not just on GPU, but on CPU on their own custom interconnect, et cetera. So what I'd like to hear a little bit of an update on is just how you think about your road map going forward across CPU, GPU and networking and particularly the networking part as you look to continue to advance your AI portfolio. Thanks.

Lisa Su

Yeah. Thanks, Matt. I think it's an important point. What we're seeing with these AI systems is they are truly complicated when you think about putting all of these components together. We are certainly working very closely with our partners in putting together sort of the full system, CPU, GPUs as well as the networking capability. Our Pensando acquisition has actually been really helpful in this area. I think we have a world-class team of experts in this area, and we're also partnered with some of the networking ecosystem overall.

So going forward, I don't think we're going to sell full systems, let's call it, AMD-branded systems. We believe that there are others who are more set up for that. But I think from a definition standpoint and when we're doing development, we are certainly doing development with the notion of what that full system will look like. And we'll work very closely with our partners to ensure that, that's well defined so that it's easy for customers to adopt our solutions.

Matthew Ramsay

Got it. Thank you for that perspective. As my second question, Jean, I wanted to dig into gross margin a little bit and, I guess, compliment you and the team on being able to guide up for the fourth quarter. Sequentially on gross margin, if we, I guess, rewound the clock back to the beginning of the year, the Embedded segment would be down from the peak to where you're guiding the fourth quarter, maybe down by a third.

I wouldn't have thought gross margin would have hung in as well and grown sequentially each quarter through the year. Obviously, Client margins got better. But maybe you could walk us through some of the puts and takes on gross margin, and inside of each segment, where you're making progress because I imagine some of that progress is pretty positive underneath. Thanks.

Jean Hu

Yeah, Matt. Thank you for the question. Yes, there are a few puts and takes, especially in a mixed demand environment. So let me just comment on Q3 first. We are very pleased with our gross margin expansion sequentially, 140 basis points. As you mentioned, Embedded segment revenue actually declined double digits sequentially. There are two primary drivers. The first one is, definitely, Data Center grew 21% sequentially, which provided a tailwind to our gross margin.

Secondly, as we went through the inventory correction in the PC market, we did encounter some headwinds in the Client segment gross margin. And in Q3, we saw very significant improvement in our Client segment gross margin. I think going forward, the pace of Client segment improvement will moderate, but it will continue to drive incremental gross margin improvement in the Client segment. So that really is why we were able to drive sequential growth in Q3.

And in Q4, I would say the major dynamic is that, with very strong double-digit growth in the Data Center business, we definitely have a tailwind, which more than offsets the Embedded segment declining by a double-digit percentage sequentially again. I think going forward, it's really mix, primarily mix, that is driving our gross margin. But we feel pretty good about the second half of next year. When we can expand Data Center significantly and, especially, the Embedded segment starts to recover, we should be able to drive more meaningful gross margin improvement in the second half.

Operator

And the next question comes from the line of Ross Seymore with Deutsche Bank. Please proceed with your question.

Ross Seymore

Lisa, I had a question on the MI300 side of things. When you go to market, obviously, there's been shortages this year of GPU accelerators, and so a second source is definitely needed. But beyond just providing that second source role, can you just walk us through some of the competitive advantages that the customers you're going to talk about on the 6th are finding to be so attractive relative to your primary competitor?

Lisa Su

Yeah. I think there's a couple of different things, Ross. I mean, if we start with, it's just a very capable product. The way it's designed from a chiplet standpoint, we have very strong compute as well as memory capacity and memory bandwidth. In inference, in particular, that's very helpful. And the way to think about it is on these larger language models, you can't fit the model on one GPU. You actually need multiple GPUs.

And if you have more memory, you can actually use fewer GPUs to infer those models, and so it's very beneficial from a total cost of ownership standpoint. From a software standpoint, this has been perhaps the area where we've had to invest more and do more work. Our customers and partners are actually moving towards an area where they're more able to move across different hardware so really optimizing at the higher-level frameworks. And that's reducing the barrier of entry of sort of taking on a new solution.

And we're also talking very much about, going forward, what the road map is. It's very similar to our EPYC evolution. When you think about sort of the -- our closest partners in the cloud environment, we work very closely to make each generation better. So I think MI300 is an excellent product and we'll keep evolving on that as we go through the next couple of generations.

Ross Seymore

For my follow-up, I want to focus on the OpEx side of things. You guys have kept that pretty tight over the years. Jean, I just wondered what the puts and takes on that might be heading into 2024. I think you're exiting this year at about up kind of high single digits, maybe 10% year-over-year. Any sort of unique puts and takes, especially as you guys are driving for all that MI300 success as we think about OpEx generally in 2024?

Jean Hu

Yeah. Thanks for the question. Our team has done an absolutely great job in reallocating resources within our budget envelope to really invest in the most important areas in AI and the data center. We are actually in the planning process for 2024. I can comment at a very high level: given the tremendous opportunities we have in AI and the Data Center, we definitely will increase both R&D investment and go-to-market investment to address those opportunities.

I think the way to think about it is our objective is to drive top line revenue growth much faster than OpEx growth, so our investment can drive long-term growth. And we also can leverage our operating model to really actually expand earnings much faster than revenue. That’s really how we think about running the company and driving the operating margin expansion.

Operator

And the next question comes from the line of Harsh Kumar with Piper Sandler. Please proceed with your question.

Harsh Kumar

Hi, Lisa. I had a strategic one for you and then somewhat of a tactical one. On the strategic side, as your key competitor is sort of getting their act together on the manufacturing technology and the nodes, would it not be feasible to think that their manufacturing cost could be significantly better, let's say, than that of yours? And so if that's the case down the line one year or two years out, I'm curious what kind of value-add offerings would AMD have to provide to a customer to keep the market share that you have in the server space, data center space and then keep that growing as well?

Lisa Su

Yes, Harsh. Maybe I should just take a step back and just talk about sort of the engagement that we have with our data center customers. When we think about sort of the EPYC portfolio and what we've been able to build over the last few generations and what we have going forward with Zen 5 and beyond, process technology is only one piece of the optimization. It's really about process technology, packaging. We're leading sort of the usage of chiplets and 2.5D and 3D integration and then when you go to architecture and design. So it's really the holistic product.

And from a pricing standpoint, actually, price is only one aspect of the conversation. Much of the conversation is on how much performance can you give me at what efficiency. So from a -- from an overall efficiency standpoint, I think we've developed fantastic products. We are working closely with our customers to ensure that we continue to evolve our overall portfolio. So I think from a value-added standpoint, it's providing the best TCO is what our customers are looking for, and that's where our road map is headed.

Going forward, I think having the CPU, the GPU, the FPGAs, the DPUs, I think it gives us actually a nice portfolio to really optimize not just on a single component basis but on sort of all of the different workloads that you need in the data center.

Harsh Kumar

Very helpful, Lisa. And then for my follow-up, a lot of folks that we talk to think that the compute game is shifting completely from CPUs to GPUs. [Technical Difficulty] So it was actually very encouraging to hear you talk about your core EPYC CPUs and the traction that you're seeing with the new generation of CPUs. So I'm curious, if I was to ask you what you think the long-term growth prospects for the next, call it, two to three to four years are for your CPU business, not the GPU but the CPU business, I'm curious what the answer would be.

Lisa Su

Yeah. So look, I'm a big believer that you need all types of compute in the data center, especially when you look at the diverse set of workloads. There's a lot of excitement around AI, and we are very much clear that that is the number one priority from a growth standpoint going forward. But in the EPYC CPU business, we feel like we've consistently gained share throughout the last few years.

And even with that, we're still underrepresented in large portions of the market, right? We're underrepresented in enterprise. We've seen some nice sort of sequential growth and nice prospects there, but there's a lot more we can do in enterprise. And we're still underrepresented in cloud third-party workloads, which, again, it's a -- you have to sell through the cloud manufacturers. So I think overall, we feel good about our EPYC leadership and also our go-to-market efforts that will help us continue to grow that business in 2024 and beyond.

Mitch Haws

Operator, we have time for two more questions.

Operator

Okay. And the next question comes from the line of Stacy Rasgon with Bernstein Research. Please proceed with your question.

Stacy Rasgon

Hi, guys. Thanks for taking my questions. First, I wanted to just dial in on the Q4 guidance. If you are going to grow Data Center 50% half over half, and I assume Client is up sequentially, that implies Gaming and Embedded are both likely down sequentially in the 20% range. I know you said double-digits. But is that right?

And if that is true, especially for Embedded, what does that mean going forward into next year? I know you said it's going to be weak in the first half. Does that mean -- I mean, is it stable at these levels or does it continue to decline through the first half until things stabilize? Just how do we think about that in the context of the guidance that you've given for Q4?

Lisa Su

Yes. Sure, Stacy. Let me take that and then Jean might add a few comments. So without getting very specific, I would say I think your comments about Data Center and Client are correct. And then from an Embedded and Gaming standpoint, we would say for Embedded, think about it down at similar levels, sort of in the teens -- Q3 was down in the teens and Q4 will be down in the teens.

And then Gaming, from a console standpoint, we do expect that to be down a bit more than that. And then as we go into Q1, again without being -- there are lots of things that need to happen -- we would expect that both Gaming and Embedded would be down into Q1 as well, and sort of the other comments would be more around seasonality. Does that help?

Stacy Rasgon

That does help. For my follow-up, again, I wanted to ask about gross margins. So I know that they've been expanding through the year, but for the full year, they're actually down. And I get the mix things and everything else. But as I look into next year, like how do I think about this because it sounds like Embedded is going to be pretty weak next year. Client is what it is.

Data Center is growing but it does feel like even if the GPUs are accretive, they're not accretive yet, and it's going to take them a while to get to be accretive. Like how much do you think you can expand gross margins year-over-year like in '24 versus '23, given the trends that we have entering the year?

Jean Hu

Yeah. Hi, Stacy. I'll say the first thing is, if you look at 2023, it's a very unusual year for the industry, right, especially the PC market. It's one of the worst down cycles during the last 3 decades. So during that kind of a down cycle, definitely, we had headwinds on the gross margin side in our Client business, where we have made significant progress in Q3 and Q4, in the second half.

Going into next year, the mix primarily is the driver of our gross margin. The way to think about it is Data Center is going to be the largest incremental revenue contributor next year. And then with both Gaming and Embedded facing continued sequential decline, I think it's all about the mix. We do expect gross margin next year to improve versus 2023, especially in the second half. So that's how we think about it right now.

Operator

And our final question comes from the line of Christopher Rolland with Susquehanna. Please proceed with your question.

Christopher Rolland

Thanks for the question. There was an article suggesting that you guys could be interested in doing some ARM-based CPUs. I guess I'd love any thoughts that you have there on that architecture for PC. But also Apple has their M3 out now. It seems pretty robust. Qualcomm has an X Elite new chip. It was rumored NVIDIA might be doing that as well. Would love your expectations for this market. And what does that mean for the TAM for AMD moving forward?

Lisa Su

Yeah. Sure, Chris. Thanks for the question. So look, the way we think about ARM, ARM is a partner in many respects, so we use ARM throughout parts of our portfolio. I think as it relates to PCs, x86 is still the majority of the volume in PCs. And if you think about sort of the ecosystem around x86 and Windows, I think it's been a very robust ecosystem. What I'm most excited about in PCs is actually the AI PC. I think the AI PC opportunity is an opportunity to redefine what PCs are as a productivity tool and really sort of operating on user data.

And so I think we're at the beginning of a wave there. We're investing heavily in Ryzen AI and the opportunity to really broaden sort of the AI capabilities of PCs going forward. And I think that's what the conversation is going to be about. It's going to be less about what instruction set you're using and more about what experience you are delivering to customers. And from that standpoint, I think that we have a very exciting portfolio that I feel good about over the next couple of years.

Christopher Rolland

Thank you, Lisa. And one quick one on FPGA for the Data Center in particular. That was a really cool fintech win. I understand that [Technical Difficulty] AI? And could we even mix in an FPGA into the MI300 tile at some point or is there really, at this point, not an AI market for FPGA?

Lisa Su

Yeah. I mean, Chris, the way I think about sort of FPGAs in the data center, it's another compute element. We do use FPGAs, or there are FPGAs, in a number of the systems. I would say from a revenue contribution standpoint, it's still relatively small sort of in the near term. We have some design wins going forward where we would see that content grow, but that won't be so much in 2024; it will be beyond that.

And part of our value proposition, I think, to our data center partners is, look, whatever compute element you need, whether it's CPUs or GPUs or FPGAs or DPUs or -- we have the ability to sort of bring those components together. And that is a strong point as we think about just how heterogeneous these data centers are going forward. So thank you for that.

Operator

At this time, we have reached the end of the question-and-answer session. Now I’d like to turn the floor back over to Mitch for any closing comments.

Mitch Haws

Great, John. That concludes today's call. Thank you to everyone for joining us today.

Operator

And ladies and gentlemen, this does conclude today's teleconference. You may disconnect your lines at this time. Thank you for your participation.