Miller's Law
Working memory can only hold 7 ± 2 items at once.
Definition
In his paper The Magical Number Seven, Plus or Minus Two (1956), American psychologist George Miller identified a fundamental constraint of human cognitive processing:
Working memory can hold, on average, 7 ± 2 items simultaneously.
This limit applies to “chunks” (units of information), not atomic units: a chunk may be a digit, a word, a concept, or even a phrase, depending on the subject’s level of expertise. A chess expert can memorise an entire position as a single chunk.
The key is therefore not the raw volume of information, but the number of active cognitive units at any one time.
Why it matters
This limit is universal and independent of intelligence. It constrains the design of any system intended to be used or understood by humans:
In interface design: menus should not exceed 7 ± 2 items. Dashboards with more than 9 indicators saturate working memory and degrade decision-making.
In communication: a presentation slide with 12 bullet-pointed messages forces excessive cognitive effort on the audience, who retain less as a result. The “one idea per slide” principle follows directly from Miller.
In teaching: effective teachers and trainers sequence information into small blocks, with pauses and summaries, to respect the limits of working memory.
In management: a manager simultaneously tracking more than 7 projects or 7 direct reports is operating beyond their optimal cognitive capacity.
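The design implication above can be sketched as a simple guard in a design-review script. This is a hypothetical helper (the function name and menu labels are illustrative, not from any real framework), flagging flat UI surfaces that exceed Miller's upper bound of 9 items:

```python
# Flag a flat menu or dashboard that shows more items at once than
# working memory can comfortably hold (7 + 2 = 9, per Miller's Law).
MILLER_UPPER_BOUND = 9

def exceeds_working_memory(labels: list[str]) -> bool:
    """True when a flat list of UI items exceeds the 7 + 2 limit."""
    return len(labels) > MILLER_UPPER_BOUND

menu = ["Home", "Search", "Orders", "Invoices", "Reports",
        "Settings", "Help", "Profile", "Admin", "Logs"]
print(exceeds_working_memory(menu))  # 10 items -> True
```

Such a check is deliberately crude: it counts items, not chunks, so a well-grouped screen of 12 elements may still be fine in practice.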
Concrete examples
Phone numbers: worldwide, numbers are formatted in groups of 2 to 4 digits, precisely to make them memorable as chunks.
Postal codes and IBANs: the same block-grouping logic applies, respecting Miller’s limit.
Shopping lists: beyond 7 items, we reach for paper or an app, not out of laziness, but because working memory is saturated.
Touch interfaces: an iOS tab bar is limited to 5 elements, an Android toolbar to 5-6. This is a deliberate choice.
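The phone-number and IBAN examples above share one mechanism: inserting separators every few characters so the reader holds a handful of chunks instead of twenty-plus symbols. A minimal sketch of that grouping (the function and the sample IBAN are illustrative):

```python
# Reformat a long character string into blocks so it can be read
# as a few chunks rather than one long run of symbols.
def group(s: str, size: int = 4) -> str:
    """Insert a space every `size` characters."""
    return " ".join(s[i:i + size] for i in range(0, len(s), size))

iban = "FR7630006000011234567890189"
print(group(iban))  # FR76 3000 6000 0112 3456 7890 189
```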
Countermeasures
Group information into meaningful chunks.
Limit lists and menus to 5-7 items.
Use hierarchies to defer details.
Exploit external memory (writing, visuals) to offload working memory.
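The hierarchy countermeasure can be sketched in code: when a flat list grows past the limit, fold it into submenus of at most 7 entries each. `make_submenus` is a hypothetical helper with placeholder group labels, not a real UI framework API:

```python
# Defer details behind a hierarchy: split an over-long flat list into
# labelled submenus that each stay within the 7-item limit.
def make_submenus(items: list[str], max_per_menu: int = 7) -> dict[str, list[str]]:
    """Partition `items` into numbered submenus of <= max_per_menu entries."""
    return {
        f"Group {n + 1}": items[i:i + max_per_menu]
        for n, i in enumerate(range(0, len(items), max_per_menu))
    }

options = [f"Option {k}" for k in range(1, 17)]  # 16 options: too many at once
menus = make_submenus(options)
print([len(v) for v in menus.values()])  # [7, 7, 2]
```

In a real interface the groups would carry meaningful names (another application of chunking), but the size constraint is the point here.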
Working memory is not a database. It is a processing space: limited and precious.