
Thermodynamic AI and the Fluctuation Frontier

2023 · 14 Citations
Patrick J. Coles
2023 IEEE International Conference on Rebooting Computing (ICRC)

TL;DR: This work identifies stochastic units (s-units) as the building blocks of Thermodynamic AI hardware and introduces a Maxwell's demon device that guides the system toward non-trivial states.

Abstract

Many Artificial Intelligence (AI) algorithms are inspired by physics and employ stochastic fluctuations. We connect these physics-inspired AI algorithms by unifying them under a single mathematical framework, which we call Thermodynamic AI, encompassing: (1) generative diffusion models, (2) Bayesian neural networks, (3) Monte Carlo sampling, and (4) simulated annealing. Such Thermodynamic AI algorithms are currently run on digital hardware, ultimately limiting their scalability and overall potential. Stochastic fluctuations naturally occur in physical thermodynamic systems, and such fluctuations can be viewed as a computational resource. Hence, we propose a novel computing device, called Thermodynamic AI hardware, that could accelerate such algorithms. We contrast Thermodynamic AI hardware with quantum computing, where noise is a roadblock rather than a resource. Thermodynamic AI hardware can be viewed as a new form of computing, since it uses a new fundamental building block. We identify stochastic units (s-units) as the building blocks for Thermodynamic AI hardware. In addition to these s-units, Thermodynamic AI hardware employs a Maxwell's demon device that guides the system to produce non-trivial states. We provide a few simple physical architectures for building these devices.
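As an illustration of the ideas in the abstract, the following sketch simulates a single s-unit as an Ornstein-Uhlenbeck stochastic process, with an added drift term standing in for the Maxwell's demon that steers the unit toward a target state. This is not the paper's implementation; the function name, parameters, and the specific demon drift are illustrative assumptions.

```python
import numpy as np

def simulate_s_unit(steps=1000, dt=1e-3, k=1.0, noise_scale=1.0,
                    demon_drift=None, seed=0):
    """Euler-Maruyama simulation of one hypothetical s-unit.

    The unit follows overdamped Langevin (Ornstein-Uhlenbeck) dynamics:
        dx = (-k * x + demon_drift(x)) dt + sqrt(2 * noise_scale) dW,
    where thermal noise dW is the computational resource and demon_drift
    plays the role of the Maxwell's demon guiding the state.
    """
    rng = np.random.default_rng(seed)
    x = 0.0
    traj = np.empty(steps)
    for t in range(steps):
        guide = demon_drift(x) if demon_drift is not None else 0.0
        # Deterministic drift plus Gaussian thermal fluctuation.
        x += (-k * x + guide) * dt + np.sqrt(2 * noise_scale * dt) * rng.standard_normal()
        traj[t] = x
    return traj

# Toy "demon": a drift biasing the unit toward x = 1, so the stationary
# distribution centers near 0.8 rather than the noise-only mean of 0.
traj = simulate_s_unit(demon_drift=lambda x: 4.0 * (1.0 - x))
```

Without the guiding drift the unit just relaxes to zero-mean thermal noise; the demon term shifts the stationary distribution, which is the sense in which the demon "produces non-trivial states."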