
How AI Aggregation Affects Knowledge

Daron Acemoglu, Tianyi Lin, Asuman Ozdaglar, James Siderius

Abstract

Artificial intelligence (AI) changes social learning when aggregated outputs become training data for future predictions. To study this, we extend the DeGroot model by introducing an AI aggregator that trains on population beliefs and feeds synthesized signals back to agents. We define the learning gap as the deviation of long-run beliefs from the efficient benchmark, allowing us to capture how AI aggregation affects learning. Our main result identifies a threshold in the speed of updating: when the aggregator updates too quickly, there is no positive-measure set of training weights that robustly improves learning across a broad class of environments, whereas such weights exist when updating is sufficiently slow. We then compare global and local architectures. Local aggregators trained on proximate or topic-specific data robustly improve learning in all environments. Consequently, replacing specialized local aggregators with a single global aggregator worsens learning in at least one dimension of the state.
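The feedback loop described in the abstract can be sketched numerically. The specific functional forms below (the aggregator retaining its state with persistence $\rho$ while training on belief averages, and agents mixing neighbors' beliefs with the AI signal at weight $\beta_i$) are illustrative assumptions, not the paper's exact equations; all variable names are hypothetical.

```python
import numpy as np

def simulate(T, w, beta, rho, x0, steps=200):
    """One plausible dynamic for the model: the aggregator's state `a`
    moves toward the training-weighted average of beliefs at speed
    (1 - rho), and each agent blends neighbors' beliefs with the AI
    signal according to its weight beta_i."""
    x = np.asarray(x0, dtype=float)
    a = w @ x                                   # aggregator trained on initial beliefs
    for _ in range(steps):
        a = rho * a + (1 - rho) * (w @ x)       # slow updating when rho is near 1
        x = (1 - beta) * (T @ x) + beta * a     # agents mix in the synthesized signal
    return x, a

# Toy example: 3 agents, symmetric trust matrix, mild reliance on the AI.
T = np.array([[0.6, 0.2, 0.2],
              [0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6]])
w = np.array([1/3, 1/3, 1/3])       # training weights (hypothetical)
beta = np.array([0.3, 0.3, 0.3])    # agents' weight on the AI signal
x, a = simulate(T, w, beta, rho=0.9, x0=[0.0, 0.5, 1.0])
print(x, a)   # beliefs and the AI state converge to a common consensus
```

With a doubly stochastic $T$ and uniform training weights, the belief average is preserved at every step, so the consensus lands at the initial mean; skewing `w` toward particular agents shifts the limit and is one way to visualize the "learning gap" relative to an efficient benchmark.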

Paper Structure

This paper contains 32 sections, 13 theorems, 206 equations, and 2 figures.

Key Result

Proposition 1

Suppose that $T$ is strongly connected and aperiodic. Then the augmented transition matrix $\Gamma$ is strongly connected and aperiodic if: (i) $\rho \in (0,1)$, (ii) $\beta_i < 1$ for all $i$.
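The conclusion of Proposition 1 can be checked computationally: strong connectivity plus aperiodicity of a nonnegative matrix is equivalent to primitivity, i.e., some power of the matrix being entrywise positive. The construction of $\Gamma$ below, with an extra row and column for the aggregator, is a plausible guess at the paper's augmented matrix, not its stated form; `w`, `beta`, and `rho` are hypothetical names.

```python
import numpy as np

def augmented_matrix(T, w, beta, rho):
    """Assumed form of the augmented chain: agents discount social
    weights by (1 - beta_i) and place beta_i on the AI node; the AI node
    trains on beliefs with weights (1 - rho) * w and keeps rho on itself."""
    n = len(beta)
    Gamma = np.zeros((n + 1, n + 1))
    Gamma[:n, :n] = (1 - beta)[:, None] * T   # discounted social weights
    Gamma[:n, n] = beta                       # weight on the AI signal
    Gamma[n, :n] = (1 - rho) * w              # training weights
    Gamma[n, n] = rho                         # aggregator persistence
    return Gamma

def is_primitive(M):
    """A nonnegative matrix is primitive (strongly connected and
    aperiodic) iff A^k is entrywise positive for k = (n-1)^2 + 1,
    Wielandt's bound, where A is the zero pattern of M."""
    A = (M > 0).astype(float)
    k = (M.shape[0] - 1) ** 2 + 1
    return bool(np.all(np.linalg.matrix_power(A, k) > 0))

T = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])               # strongly connected, aperiodic
Gamma = augmented_matrix(T, w=np.array([0.4, 0.3, 0.3]),
                         beta=np.array([0.2, 0.2, 0.2]), rho=0.5)
print(is_primitive(Gamma))   # True when rho is in (0,1) and each beta_i < 1
```

Setting `rho = 1` severs the training link from agents to the aggregator, and setting some `beta_i = 1` zeroes out that agent's social row, which is why conditions (i) and (ii) are needed.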

Figures (2)

  • Figure 1: Global aggregator architecture.
  • Figure 2: Local aggregator architecture.

Theorems & Definitions (13)

  • Proposition 1
  • Theorem 2
  • Theorem 3
  • Proposition 4
  • Proposition 5
  • Proposition 6
  • Theorem 7
  • Proposition 8
  • Lemma 9
  • Lemma 10
  • ...and 3 more