As I have mentioned in previous posts, I have been reading "Thinking, Fast and Slow" by Nobel laureate Daniel Kahneman.
The book is very broad in its coverage of interesting topics. Today I want to write a bit about the concept of "public policy risk."
Kahneman covers this in his book, and I think there are some important ideas there, though I am not sure I agree with his conclusions.
Public policy and risk go hand in hand. Why are there extra handrails on the stairs, why are children pushed to have vaccinations before entering school, why do we have double yellow (no passing) lines on a highway where we are blind to oncoming traffic?
All of these are public policy solutions to a perceived "risk."
The risk of someone falling, the risk of a child having a disease that spreads, the risk of pulling out to pass and being struck by another car.
These are things that the government wants you to be "protected from" - so they (the government) create policies to reduce the perceived risk that these bad things will happen.
One of the things Kahneman discusses is a fellow researcher named Paul Slovic. Slovic has spent a great deal of time studying public policy risk, and his conclusions are quite interesting.
To quote from Kahneman's book, Slovic concludes that:
“Risk” does not exist “out there,” independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.” (Kahneman, Daniel (2011-10-25). Thinking, Fast and Slow (p. 141). Macmillan. Kindle Edition.)
Interesting - no such thing as "objective" risk...
Another point Slovic makes is that things like press coverage of unusual events dramatically color people's perception of risk.
For example, various examples are given where very unlikely events, such as botulism poisoning, are juxtaposed with common events, say death in a car accident. The unlikely events, which often receive extensive press coverage, are often believed to be much more likely than they really are.
I touched on this back a few months ago in "The Risk of the Cure."
People tend to panic over very unlikely things and even to make extensive investments in protections against dangers that matter very little in the broader scheme of things.
So what is risk then?
According to Wikipedia, risk has many potential meanings - for example, "the effect of uncertainty on objectives" or the probability of uncertain events.
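As an aside, a common textbook formalization (my own addition here, not something Kahneman or Wikipedia state in this form) treats risk as expected loss:

\[
R = p \times C
\]

where \(p\) is the probability of the event and \(C\) is the cost if it occurs. With purely hypothetical numbers, a 1-in-100,000 annual chance of a $10,000,000 flood works out to an expected loss of $100 per year - a figure that feels nothing like the vivid mental image of the disaster itself, which is rather the point of what follows.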
But events exist independently of any human attempt to quantify them.
Lightning strikes the earth all the time - the presence of humans has little or no effect on this.
But humans create the notion of "the risk of being struck by lightning" as a means to relate this natural and unpredictable event to themselves and their interests.
One of the interesting things Kahneman does not say - and so far this is my overall criticism of the book - is that, to the human mind, risk is not based on "probability" at all.
Instead the human mind sees "risk" as "difference."
Your mind views the world from a certain perspective - things ordered in a certain way, with some set of rules that your mind perceives as the "proper order" of things. When something changes, that's important to your mind.
In Kahneman's terms, "System 1" detects a difference, which raises an alarm.
Now, throughout non-technical human history, i.e., before about 1400 or so, this was an adequate means for functioning in the world. When things were "different" they required checking into: more rain than "normal", less rain than "normal", growling at the edge of the forest, unusual symptoms, unusual actions, and so on.
You can see this today with any pack of dogs. When a single dog detects a change, e.g., someone coming down the driveway, he barks. Why? Because the "System 1" in his brain says "I see something different," and he must alert his pack-mates, because difference can equal danger.
Humans, as I see it, are exactly the same - your mind is interested in the "something different" part of the world.
The "everyday same" part does not cause alarm at all - its what's expected.
Its only the unexpected that matters - when the brain detects what is not ordinary or normal.
Kahneman goes on to discuss the "availability cascade." According to the book, "... availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action." (Kahneman, Daniel (2011-10-25). Thinking, Fast and Slow (p. 142). Macmillan. Kindle Edition.)
The "minor event" is an event seen by the public as one involving an amount of risk the public may not wish to take, e.g., water pooling around the base of a dam or a crack in a bridge support.
The "cascade" Kahneman refers to is really what I call a "witch hunt."
Kahneman limits his discussion of the public's reaction to public policy issues, but in fact I believe the phenomenon is much wider.
The notion of a witch hunt is exactly the same. A small outlying incident - basically one which is not easily explained and so, for example, gets attributed to witchcraft - causes panic to spread, ultimately resulting in government policy. Just like someone showing up in town after seeing water at the base of the dam (the water may not indicate the dam is leaking, but as news spreads that becomes irrelevant). No mother wants her children washed away in a flood when taking some form of action could easily prevent it.
Dogs react in a similar way to an unusual event - sometimes bringing the entire pack into a frenzy. For example, an unrecognized car comes down the driveway and then an unrecognized person emerges. The pack turns into a snarling mob against itself as panic spreads.
So for me, most risk, and particularly public policy risk, is simply a perception by some small portion of the public that turns into a widespread panic or issue.
Now what's interesting is that often, based on misguided intentions or a poor collection of basic facts, the actions taken by government or the public have little to do with the actual supposed risk.
And often risk is not assessed where it should be, e.g., the dam that burst causing the flood. Sometimes a small percentage of the public sees the actual risk, but often it's the case that "no one saw it coming," as in "the big one that wipes out California."
Kahneman's mistake with risk, I think (and this relates to some future posts I plan), is that logic and probability are tools created by man in an attempt to scientifically and mathematically quantify the risk of the "unknown." But they do not define risk as a human mind does. Differences and their potential risk in the environment are not assessed by calculation or science, because the mind was not developed that way. Instead, minds evolved (in humans and dogs) by sitting around and noticing meaningful differences in the world.
Those that "saw it coming" and were "prepared" were rewarded with a better chance of survival.
Before 1400 or so people basically had only their own senses to assess risk with. Certainly there were some historical aspects, e.g., the river floods every year, but the risk of a wolf bursting out of the woods to snatch a child could only be mitigated by vigilance and, in particular, the detection of "differences" from the norm, e.g., the forest becoming very quiet before the attack.
So I think that Kahneman is, to some degree, attempting to force a scientific notion of logic and risk onto a human process that is neither logically nor mathematically driven.
To some degree the human mind still operates as if we are sitting around a fire in the dark and danger lurks outside the light cast by the fire. The mind repeats "What's different?" over and over, looking for signs of danger.
So risk, then, is a scientific attempt to quantify danger with some mathematical formula, e.g., "the chance the dam will break is 0.001%." However, if water is pooling at the base of the dam and I live downstream, my mind will not be comforted by the explanation that "it's just rainwater collecting from a larger-than-normal run of rainy days." My mind (Kahneman's "System 1") says - "no way - water at the base of the dam = bad news for my survival."
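To make that gap concrete (my own back-of-the-envelope reading of that figure, not Kahneman's):

\[
p(\text{dam fails this year}) = 0.001\% = \frac{1}{100{,}000}
\]

The statistician expects roughly one failure per 100,000 dam-years; System 1, looking at water pooling at the base, assigns something much closer to certainty. The two numbers are answers to two different questions.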
The notion of intuition, e.g., "woman's intuition," falls along these lines. Your mind may perceive cues from the environment which don't fully penetrate your conscious mind (Kahneman devotes a large section of the book to this idea as "priming").
But science has no idea what cues trigger this mental "System 1," so how can it really assess what humans do with risk?
So while scientists and public policy makers attempt to quantify known risks, e.g., the probability the dam will fail if built badly, this is on the whole entirely different from the perception of risk the mind forms at any given moment.
And, at least so far, this is the other fatal flaw in Kahneman's book: his own notions of things - his very experiments and means of collecting data, for example - are as biased as those of the participants in his studies. Similarly, his notion that somehow science is going to understand how and why humans make decisions isn't too meaningful if it only measures superficial results and does not probe further.