The Framing Effect
The same data produces opposite decisions depending on how it is presented.
Definition
The framing effect is a cognitive bias first demonstrated by Amos Tversky and Daniel Kahneman in 1981 with their famous “Asian disease problem”. It refers to the phenomenon whereby a decision changes depending on whether the information is presented from a positive or a negative angle, even when the underlying reality is strictly identical.
“Save 200 out of 600 people” and “let 400 out of 600 people die” are objectively equivalent. But people choose differently depending on the wording.
Why it matters
The framing effect infiltrates every domain where decisions are made from information:
In medicine: “this treatment has a 90% survival rate” generates far more acceptance than “this treatment has a 10% mortality rate”. Even doctors are not immune to this bias.
In finance: “an investment with a 30% chance of gain” vs “an investment with a 70% chance of loss” describe the same product (when every outcome is either a gain or a loss), yet produce radically different decisions.
In politics: the same statistics on immigration, employment, or crime can be framed positively or negatively to steer opinion. Framing is often more powerful than the data itself.
In marketing: “contains 90% natural ingredients” vs “contains 10% additives”, same product, radically different perception.
Concrete examples
The Asian disease problem (Tversky & Kahneman, 1981): faced with an epidemic expected to kill 600 people, participants chose between a sure programme and a risky one with the same expected outcome. When the sure programme was framed as “200 people will be saved”, 72% chose it; when the identical programme was framed as “400 people will die”, only 22% did.
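The equivalence behind the experiment can be checked with a few lines of arithmetic. The sure programme's two framings describe the same outcome, and the risky programme (whose 1/3–2/3 odds come from the original 1981 study) has the same expected number of lives saved:

```python
# Epidemic expected to kill 600 people (Tversky & Kahneman, 1981).
total = 600

# Sure programme: 200 saved for certain.
# Positive frame: "200 people will be saved"; negative frame: "400 people will die".
sure_saved = 200
assert total - sure_saved == 400  # both framings pin down the same state of the world

# Risky programme: 1/3 chance all 600 are saved, 2/3 chance nobody is.
risky_saved = (1 / 3) * total + (2 / 3) * 0

# Both programmes have the same expected outcome.
assert sure_saved == risky_saved == 200
```

The preference reversal in the experiment therefore cannot be explained by the numbers; only the wording differs.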
Food labels: “75% lean” vs “25% fat”, consumers rate the first significantly better, even though it’s the same product.
Crisis communication: “we have resolved 90% of incidents” vs “10% of incidents are still unresolved”, the same operational reality can be framed as a success or a failure.
Insurance policies: the same cover presented positively (“you are covered for…”) generates more uptake than a list of exclusions and excesses.
Counter-measures: deliberately reframe each important decision from multiple angles, create checklists that impose a negative perspective (“what could go wrong?”), and practise identifying framing in every communication received.
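The first counter-measure, deliberately producing the complementary frame for any statistic, can be sketched as a small helper (the function name and labels below are hypothetical, chosen for illustration):

```python
def reframe(rate: float, positive_label: str, negative_label: str) -> tuple[str, str]:
    """Return both framings of a complementary statistic.

    Hypothetical helper: given a rate (e.g. 0.90 survival), it emits the
    positive framing and the complementary negative framing side by side,
    so a decision is never read through a single frame.
    """
    positive = f"{rate:.0%} {positive_label}"
    negative = f"{1 - rate:.0%} {negative_label}"
    return positive, negative


print(reframe(0.90, "survival rate", "mortality rate"))
# ('90% survival rate', '10% mortality rate')
```

Seeing both outputs at once is the point: if the decision flips between the two lines, the framing, not the data, was driving it.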
It’s not reality that decides: it’s the words chosen to describe it.