For the past five decades, the number of transistors that can be packed onto a silicon chip has doubled roughly every two years — a trend known as Moore’s Law — and computing power has grown along with it. Today, the top-of-the-line chips used to train AI models behind apps like ChatGPT contain more than 200 billion of these microscopic switches, and many experts predict trillion-transistor GPUs within the decade.
Better chips have been key to developing new AI systems that have already led to dramatic advances in health care, education, and virtually every other area of modern life. But as the number of transistors on a chip grows, so do the challenges. More transistors require more energy to run the chip, and more energy coursing through the chip generates more waste heat. The energy required to power and cool the thousands of chips in planned data centers now rivals that of entire cities, leading AI companies to consider drastic solutions such as building nuclear reactors next to data centers or even launching data centers into space.
For Colgate physics professor Ken Segall, solving AI’s energy crisis requires rethinking the way chips are built from the ground up. For nearly two decades, he has been working on a radically different approach to AI known as superconducting neuromorphic computing: creating chips that operate just a few degrees above absolute zero and function more like a biological brain. Read more in Colgate Magazine.