Highlights

  • Google CEO Sundar Pichai is saying sayonara to the era of easy AI advancements.
  • At the New York Times Dealbook Summit last week, Pichai sat down to discuss one of the biggest questions hanging over the tech industry right now: whether current techniques for improving AI models are reaching their limit.
  • He was a little mealy-mouthed when it came to directly addressing the technical side of the issue, but was unequivocal that the days of effortless newbie gains in AI development are over.
  • “I think the progress is going to get harder,” Pichai said at the Dealbook Summit. “When I look at ’25, the low-hanging fruit is gone. The hill is steeper.”
  • Fears that generative AI improvements were hitting a wall came to a head last month, as reports trickled out that OpenAI researchers had discovered that the company’s upcoming large language model, code-named Orion, showed significantly smaller improvements over its predecessor (and in some cases no clear advances at all) than previous iterations had.
  • This lent credence to the suspicion that bolstering AI models by adding more data and computing power, or “scaling,” was finally showing diminishing returns, as many experts had predicted. In response, OpenAI CEO Sam Altman smugly dismissed those claims, tweeting “there is no wall,” while others in the organization have hyped up its AI capabilities even further.
  • Pichai says he doesn’t “fully subscribe to the wall notion” himself — but agrees that AI developers will have to stop relying on scaling.
  • “When you start out quickly scaling up you can throw more compute and make a lot of progress,” the Google CEO said at the NYT event. “We are definitely going to need deeper breakthroughs as we go to the next stage.”
  • “I’m very confident there will be a lot of progress in ’25,” he added. “I think the models are definitely going to get better at reasoning, completing a sequence of actions more reliably — more agentic if you will.”
  • “The current amount of compute we’re using is just an arbitrary number. It’s not like we’re using a lot of compute,” he said. “There’s no reason why we can’t just keep scaling up.”
  • This is a striking thing to assert, because the largest AI models, including Google’s, are notorious for devouring an ungodly amount of power — so much that both Google and Microsoft are firing up nuclear power plants just to meet their energy demands.
  • Meanwhile, the highly coveted AI chips that the industry depends on are in such demand that manufacturer Nvidia can barely keep up. These all seem like obvious reasons why you can’t keep scaling up forever.
  • Nonetheless, Pichai concedes that scaling on its own won’t be enough. The real differentiators, he said, will be “technical” and “algorithmic” breakthroughs.