AI Speed, Nuclear, Quantum

AI Speed

AI continues to develop globally at an extremely rapid pace. AI and LLMs are software-based, which means they can be iterated on and updates can be deployed in real time. It’s a challenge to keep up with AI-related developments even week to week. This talk from April 5th with Eric Schmidt is worth listening to. Here are a few takeaways:

  • The roughly $250 million cost of training the next generation of LLMs will mainly be spent on electricity.
  • Don’t underestimate scale; consider the market opportunity of billions of users across the many possible AI applications.
  • Eric’s ultimate AI-driven education vision is “an AI tutor for anyone in the world at any education level in their language for free on their phone that can adapt to that person’s learning style, attention span, etc.” This is one area I am hugely optimistic about, not necessarily today but certainly within the next 5-10 years. “Think about a world where everyone is educated to the maximum they can be.” We don’t have large data sets today on how people learn, but once we build them, we can train models to adapt and customize to unique learning styles.
  • All famous people with large digital footprints will be cloned by AI and live on forever, provided there is enough data to create the model. If I write enough of these letters, I, too, will be cloneable.
  • As a final comment, Eric asks people to continue to build AI with positive human values. He noted at the beginning of the talk that social media is driven not by morals or values but by revenue and attention.
  • Consider the impact of AI leading to a doubling of productivity. We don’t have economic models for what happens when productivity doubles. We happened to write about this topic last week.
  • Eric touched on quantum computing at the end, but not in the depth I had hoped for. I wanted to hear about quantum-supported AI model training. It’s my understanding that quantum infrastructure applied to AI model training could result in an order-of-magnitude improvement in cost, time and scope. Instead of Gemini requiring hundreds of millions of dollars, multiple complete data centers and months of training, the model could be trained on a single machine in roughly an hour. Perhaps practical quantum AI applications are like fusion: always teasing, likely to happen, seemingly just around the corner, but realistically not yet ready. Google, Microsoft and Amazon have been working on quantum computing solutions for a long time.

Aside from this interview, there is a lot of coverage of the energy costs of AI. The FT has an article arguing that the constraining factor for future AI growth will be the electricity grid itself. Sam Altman planted this idea recently during one of his many interviews. It implies that demand for AI is nearly unlimited, or at least beyond our global capacity. That statement likely contains some truth, but it is also a great marketing tactic.
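
For rough perspective on those electricity numbers, here is a minimal back-of-envelope sketch in Python. The $250 million figure comes from the talk above; the electricity rate and the US generation total are my own illustrative assumptions, not numbers from the talk or the FT article.

    # Back-of-envelope: electricity implied by a $250M training budget,
    # assuming (my assumption) the full budget goes to power.
    TRAINING_BUDGET_USD = 250e6      # figure cited in the talk
    PRICE_USD_PER_KWH = 0.05         # assumed industrial rate; varies widely
    US_ANNUAL_GENERATION_TWH = 4000  # approximate US total, for scale only

    energy_twh = (TRAINING_BUDGET_USD / PRICE_USD_PER_KWH) / 1e9  # kWh -> TWh
    share = energy_twh / US_ANNUAL_GENERATION_TWH

    print(f"Implied energy: {energy_twh:.1f} TWh")        # ~5 TWh
    print(f"Share of annual US generation: {share:.3%}")  # ~0.125%

At a higher assumed rate of $0.10/kWh the figure halves to roughly 2.5 TWh. Either way, a single training run consumes a small but no longer negligible fraction of national generation, which is the grid-constraint argument in miniature.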

On this same topic, a nuclear power plant in Michigan recently became the first plant in the United States to be recommissioned. The plant was shut down in 2022 for financial reasons. This event could mark a turning point in nuclear power adoption, though more political and social support for nuclear energy is still needed. I didn’t see any AI or data center news tied to this Michigan plant, but nuclear and AI are a natural fit.
