Nvidia CEO Jensen Huang Claims AI Chips Outpace Moore's Law, Promises Cheaper AI Models

January 8, 2025
  • Huang emphasized that today's Nvidia AI chips are 1,000 times more powerful than those the company produced a decade ago, underscoring the pace of its technological advancement.

  • In a recent interview with TechCrunch, Nvidia CEO Jensen Huang claimed that the performance of Nvidia's AI chips is advancing faster than Moore's Law, the observation that the number of transistors on a chip doubles roughly every two years.

  • At the CES keynote, which attracted around 10,000 attendees, Nvidia showcased its efforts to make AI reasoning models, such as OpenAI's o3, more affordable by enhancing computing capabilities.

  • He noted a trend of decreasing prices for AI models, attributing this to improvements in Nvidia's hardware, and expressed optimism that this trend will persist despite the initial high costs of models like OpenAI's o3.

  • The story was reported by Maxwell Zeff, a TechCrunch reporter who specializes in AI and emerging technologies.

  • Huang's strategy for chip development is comprehensive, integrating architecture, chip design, systems, libraries, and algorithms to drive faster innovation in the AI sector.

  • Huang introduced the idea of a 'hyper Moore's Law' in the AI sector, identifying three active scaling laws: pre-training, post-training, and test-time compute, with the last applying during the inference phase.

  • Huang believes that advancements in AI inference will not only lower costs but also enhance performance, reflecting historical trends associated with Moore's Law.

  • Huang's statements were made following his keynote address at CES on January 7, 2025, highlighting the rapid evolution of AI technology.

  • As AI companies like Google, OpenAI, and Anthropic rely heavily on Nvidia's technology, Huang's performance claims are particularly significant, especially as the industry shifts focus from training to inference.

  • While Nvidia's chips, especially the H100, have been vital for AI model training, there are growing concerns about their affordability for inference as tech companies adjust their strategies.

  • Huang expressed confidence that as Nvidia continues to innovate in hardware, the cost of running AI models will decrease, benefiting the entire industry.

Summary based on 4 sources
