NVIDIA’s GPUs powered the AI revolution. Its new Blackwell chips are up to 30 times faster


In less than two years, NVIDIA's H100 chips, used by nearly every AI company in the world to develop the large language models that power services like ChatGPT, have made it one of the world's most valuable companies. On Monday, NVIDIA announced a new generation platform called Blackwell, whose chips are 7 to 30 times faster than the H100 and use 25 times less power.

“Blackwell GPUs are the engine powering this new industrial revolution,” NVIDIA CEO Jensen Huang said at the company’s annual GTC event in San Jose, attended by thousands of developers, which some compared to a Taylor Swift concert. “Generative AI is the defining technology of our time. Working with the world’s most dynamic companies, we will realize the promise of AI for every industry,” Huang added in a press release.

NVIDIA’s Blackwell chips are named after David Harold Blackwell, a mathematician who specialized in game theory and statistics. NVIDIA claims that Blackwell is the most powerful chip in the world. At 20 petaflops, compared to just 4 petaflops for the H100, it offers AI companies a significant performance boost. Much of this speed comes from the 208 billion transistors on the Blackwell chips, versus 80 billion on the H100. To achieve this, NVIDIA combined two large dies that can talk to each other at 10 terabytes per second.

In a sign of how much our modern AI revolution depends on NVIDIA chips, the company’s press release includes testimonials from seven CEOs who collectively lead multi-trillion-dollar companies. They include OpenAI CEO Sam Altman, Microsoft CEO Satya Nadella, Alphabet CEO Sundar Pichai, Meta CEO Mark Zuckerberg, Google DeepMind CEO Demis Hassabis, Oracle Chairman Larry Ellison, Dell CEO Michael Dell, and Tesla CEO Elon Musk.

“There’s nothing better than NVIDIA hardware for artificial intelligence right now,” Musk said. “Blackwell offers huge performance leaps and will accelerate our ability to deliver cutting-edge models. We’re excited to continue working with NVIDIA to advance AI computing,” Altman said.

NVIDIA did not say how much the Blackwell chips will cost. Its H100 chips currently run between $25,000 and $40,000 per chip, according to CNBC, and entire systems equipped with these chips can cost $200,000.

Despite their cost, NVIDIA’s chips are in high demand. Last year, delivery wait times ran as long as 11 months. And getting access to NVIDIA’s AI chips is increasingly seen as a status symbol for tech companies looking to attract AI talent. Earlier this year, Zuckerberg touted the company’s efforts to build “massive infrastructure” to power Meta’s AI efforts. “By the end of this year,” Zuckerberg wrote, “we’ll have ~350k Nvidia H100s — and a total of ~600k H100 equivalents of compute.”

