
Generative AI for material design: A mechanics perspective from burgers to matter

Vahidullah Tac, Ellen Kuhl

Abstract

Generative artificial intelligence offers a new paradigm to design matter in high-dimensional spaces. However, its underlying mechanisms remain difficult to interpret, which limits its adoption in computational mechanics. This gap is striking because its core tools (diffusion, stochastic differential equations, and inverse problems) are fundamental to the mechanics of materials. Here we show that diffusion-based generative AI and computational mechanics are rooted in the same principles. We illustrate this connection using a three-ingredient burger as a minimal benchmark for material design in a low-dimensional space, where both forward and reverse diffusion admit analytical solutions: Markov chains with Bayesian inversion in the discrete case and the Ornstein-Uhlenbeck process with score-based reversal in the continuous case. We extend this framework to a high-dimensional design space with 146 ingredients and 8.9 × 10^43 possible configurations, where analytical solutions become intractable. We therefore learn the discrete and continuous reverse processes using neural network models that infer inverse dynamics from data. We train the models on only 2,260 recipes and generate one million samples that capture the statistical structure of the data, including ingredient prevalence and quantitative composition. We further generate five new burgers and validate them in a restaurant-based sensory study with 100 participants, where three of the AI-designed burgers outperform the classical Big Mac in overall liking, flavor, and texture. These results establish diffusion-based generative modeling as a physically grounded approach to design in high-dimensional spaces. They position generative AI as a natural extension of computational mechanics, with applications from burgers to matter, and chart a path toward data-driven, physics-informed generative design.
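The discrete case summarized above (a Markov chain over the three-ingredient cube, reversed by Bayesian inversion) can be made concrete with a short simulation. The following sketch is our illustration, not the authors' code; the flip probability beta = 0.1 and the step count are assumed values chosen for clarity. It builds the one-step transition matrix for independent ingredient flipping and confirms that the training distribution diffuses to the maximum-entropy uniform limit p_inf = 1/8.

```python
import itertools
import numpy as np

# Three-ingredient state space [bun, patty, cheese]: 2**3 = 8 possible burgers.
states = list(itertools.product([0, 1], repeat=3))

# Training data: cheeseburger (1,1,1) and hamburger (1,1,0), probability 0.5 each.
p = np.zeros(8)
p[states.index((1, 1, 1))] = 0.5
p[states.index((1, 1, 0))] = 0.5

beta = 0.1  # per-step flip probability for each ingredient (assumed value)

# One-step Markov transition matrix: each ingredient flips independently,
# so the move probability depends only on the Hamming distance d.
T = np.zeros((8, 8))
for i, s in enumerate(states):
    for j, t in enumerate(states):
        d = sum(a != b for a, b in zip(s, t))
        T[i, j] = beta**d * (1 - beta)**(3 - d)

def entropy(q):
    """Shannon entropy in bits; maximum is log2(8) = 3."""
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# Forward diffusion: probability mass spreads across the cube and the
# entropy grows toward its 3-bit maximum, p -> uniform 1/8.
for _ in range(200):
    p = p @ T

print(entropy(p))  # -> 3.0 (to numerical precision)
```

Each row of T sums to one by the binomial theorem, so the chain is a proper Markov chain, and the uniform distribution is its unique stationary state for 0 < beta < 1.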


Paper Structure

This paper contains 18 sections, 49 equations, 11 figures, and 2 tables.

Figures (11)

  • Figure 1: Three-ingredient burger problem. Three-ingredient space $[\, x_\text{bun}, x_\text{patty}, x_\text{cheese} \,]$ with bun, patty, and cheese (left); with each ingredient either present or absent, $\mathbf{x} = [\, x_\text{bun}, x_\text{patty}, x_\text{cheese}\, ] \in \{0,1\}^3$, generating $2^3=8$ possible burgers (middle); training data with cheeseburger $\mathbf{x}_1 = [\,1,1,1\,]$ and hamburger $\mathbf{x}_2 = [\,1,1,0\,]$ (right).
  • Figure 2: Discrete modeling of forward diffusion. Three-ingredient space with eight possible burgers and training data, cheeseburger $\mathbf{x}_1 = [\,1,1,1\,]$ and hamburger $\mathbf{x}_2 = [\,1,1,0\,]$ with equal probabilities $p_{\text{data}}(\mathbf{x}_1) = 0.50$ and $p_{\text{data}}(\mathbf{x}_2) = 0.50$, highlighted in color (left); forward diffusion with independent equal-probability flipping of the three ingredients gradually diffuses probabilities across the cube (middle); converged state of maximum entropy with equal probabilities $p_{\infty} = \frac{1}{8}$ across all eight burgers (right).
  • Figure 3: Discrete modeling of reverse diffusion. Noised data with maximum entropy and equal probabilities $p_{\infty} = \frac{1}{8}$ of all eight burgers (left); reverse diffusion with probability-weighted flipping of the three ingredients gradually diffuses probabilities against their gradients towards the training set (middle); de-noised data with three possible burgers, cheeseburger $\mathbf{x}_1 = [\,1,1,1\,]$, hamburger $\mathbf{x}_2 = [\,1,1,0\,]$, and newly discovered cheese sandwich $\mathbf{x}_3 = [\,1,0,1\,]$ with probabilities $p_{\text{data}}(\mathbf{x}_1) = 0.50$, $p_{\text{data}}(\mathbf{x}_2) = 0.49$, and $p_{\text{data}}(\mathbf{x}_3) = 0.01$, highlighted in color (right).
  • Figure 4: Forward diffusion in discrete ingredient space. Forward diffusion noises the training data to obtain a uniform distribution. Starting from two training states, cheeseburger and hamburger, probability mass spreads across all states and approaches a uniform distribution as reflected by the convergence of individual state probabilities (left), homogenization of the probability distribution (center), and increase in Shannon entropy (right).
  • Figure 6: Generating new burgers by discrete diffusion. Sampling complexity for discovering a new burger, a cheese sandwich $[\,1,0,1\,]$, at Hamming distances of $d = 2$ and $d = 1$ from the training data, hamburger $[\,1,1,0\,]$ and cheeseburger $[\,1,1,1\,]$. The probability of endpoint discovery $p_{\text{end}}$ increases with the flip probability $\beta$ and saturates at the uniform limit $p_{\infty} = \frac{1}{8}$ (left); the probability of pathwise discovery $p_{\text{path}}$ grows from its initial small-$\beta$ scaling, $p_{\text{path}} \approx \mathcal{O}(\beta)$, and approaches unity as trajectories explore the full state space (middle); the number of sample trajectories required for 95% discovery, $N_{95}$, decreases rapidly with increasing flip probability $\beta$ (right).
  • ...and 6 more figures
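The reverse process in the captions above (probability-weighted flipping that concentrates mass back on the training set) admits an exact form in this small state space: the reverse transition is the Bayesian inversion of the forward kernel, $q(x_{t-1} = i \mid x_t = j) \propto K_{ij}\, p_{t-1}(i)$, where $p_{t-1}$ are the forward marginals. The sketch below is our illustration, not the authors' code; the values of beta, the diffusion horizon, and the sample count are assumptions. It runs the exact reversal from near-uniform noise and checks that nearly all samples land on the two training burgers.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# State space [bun, patty, cheese]; training data: cheeseburger (1,1,1)
# and hamburger (1,1,0) with probability 0.5 each.
states = list(itertools.product([0, 1], repeat=3))
p0 = np.zeros(8)
p0[states.index((1, 1, 1))] = 0.5
p0[states.index((1, 1, 0))] = 0.5

beta, n_steps = 0.1, 30  # assumed flip probability and diffusion horizon

# Forward kernel K[i, j]: each ingredient flips independently with
# probability beta, so K depends only on the Hamming distance d.
K = np.zeros((8, 8))
for i, s in enumerate(states):
    for j, t in enumerate(states):
        d = sum(a != b for a, b in zip(s, t))
        K[i, j] = beta**d * (1 - beta)**(3 - d)

# Forward marginals p_t, needed to invert the kernel step by step.
p = [p0]
for _ in range(n_steps):
    p.append(p[-1] @ K)

# Reverse diffusion by Bayesian inversion:
#   q(x_{t-1} = i | x_t = j) ∝ K[i, j] * p_{t-1}[i].
n_samples, hits = 2000, 0
for _ in range(n_samples):
    j = rng.integers(8)  # start from (near-)uniform noise
    for t in range(n_steps, 0, -1):
        w = K[:, j] * p[t - 1]
        j = rng.choice(8, p=w / w.sum())
    hits += states[j] in {(1, 1, 1), (1, 1, 0)}

print(hits / n_samples)  # close to 1: samples concentrate on the training data
```

Because the reversal here is exact rather than learned, almost no probability leaks to new states; the rare "cheese sandwich" discoveries of Figure 3 arise only through the residual mismatch between the uniform starting noise and the true forward marginal $p_{t}$, and, in the high-dimensional setting, through the approximation error of the trained neural network.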