By Wallace Witkowski
Grace Hopper GH200 CPU+GPU follows AMD’s MI300X announcement in June
You didn’t think Nvidia Corp. would let the flagship AI data-center product of Advanced Micro Devices Inc., its closest rival in that market, go unanswered, did you?
At the 2023 Special Interest Group on Computer Graphics and Interactive Techniques, or SIGGRAPH, conference, Nvidia (NVDA) founder and Chief Executive Jensen Huang unveiled his answer to AMD’s (AMD) Instinct MI300X accelerator, which the chip maker has been teasing all year.
In his keynote address Tuesday, Huang introduced the next-generation GH200 Grace Hopper Superchip, built for large-memory generative-AI models like OpenAI’s ChatGPT, which is backed by Microsoft Corp. (MSFT), and intended "to scale out the world’s data centers."
Nvidia shares, which had been down about 1% ahead of the keynote, dropped as much as 3% to an intraday low of $440.56 following the announcement and finished down 1.7% at $446.64.
AMD shares closed down 3.1% at $113.23. Meanwhile, the PHLX Semiconductor Index SOX fell 1.6%, the S&P 500 SPX declined 0.4% and the tech-heavy Nasdaq Composite COMP dropped 0.8%.
In a press conference ahead of the announcement, Nvidia’s head of hyperscale and high-performance computing, Ian Buck, told reporters the GH200 packs more memory and more bandwidth than the company’s H100-based data-center system. The GH200 pairs Nvidia’s Hopper GPU with its Grace CPU, which is based on Arm Ltd.’s architecture, and carries 141 GB of HBM3e memory with 5 TB per second of bandwidth.
Two GH200s can be linked over NVLink in a dual-GH200 configuration, providing 3.5 times the memory capacity and three times the bandwidth. Both versions will be available in the second quarter of 2024, but Nvidia did not comment on pricing.
Buck said the vast majority of AI training and inferencing is done on Nvidia’s current HGX systems. The GH200 gives inference customers a new option that delivers twice the performance per watt for AI workloads, as cloud-service providers look to build out capacity without significantly increasing their energy costs.
In addition to Microsoft’s Azure, other hyperscale cloud providers, like Amazon.com Inc.’s (AMZN) Amazon Web Services and Alphabet Inc.’s (GOOGL) Google Cloud Platform, are expected to fuel sales of AI chips as they build out capacity in the second half of the year.
Read: Chip-equipment suppliers rally after Lam says AI servers will drive growth
On Friday, AMD broke with the broad tech selloff to finish higher on the week as analyst support gathered for the chip maker’s AI position following its earnings report, in which AMD Chair and CEO Lisa Su forecast "multiple winners" in the AI race. Nvidia reports its earnings after the market close on Aug. 23.
Read: Will AI do to Nvidia what the dot-com boom did to Sun Microsystems? Analysts compare current hype to past ones.
AMD introduced the Instinct MI300X at its AI product launch in June. The chip maker is regarded as a distant second to Nvidia in AI data-center hardware market share.
Read: Nvidia gets more good news from Big Tech, even as AI spending ‘may not lift all boats’
Year to date, AMD shares have gained 74.8%, while Nvidia shares have soared more than 205% and the SOX index has rallied 45.3%. The S&P 500 has advanced 17.2% and the Nasdaq has climbed 32.7% over the same time frame.
Read: Nvidia ‘should have at least 90%’ of AI chip market with AMD on its heels
-Wallace Witkowski
This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.
(END) Dow Jones Newswires
08-08-23 1722ET