Previously, I challenged conventional diversification as an investment strategy, given its inability to protect portfolios against Black Swans—catastrophic events deemed unforeseeable—and the more common Grey Swans—foreseeable disasters considered remote.
The difference between these phenomena, and their presumably equally dire outcomes, is subtle yet significant. They are random to different degrees, hence their shading. The 1908 Tunguska event, when an asteroid abruptly terminated its journey through space with a roughly 10-megaton blast that leveled 830 square miles of Siberia, was truly random. An earthquake on the San Andreas is unpredictable, but not so random as to be beyond the scope of imaginability.
Still, even these more foreseeable disasters are considered so unlikely that our minds, and therefore markets, don’t discount their risk. Even experts who know better can lapse into suspended disbelief to avoid cognitive dissonance. One possible cause of these reactions is often overlooked.
Left & Right Brains
Western cultures tend to privilege so-called “left brain,” linear explanations; i.e., a causal sequence of events. The meanings or impacts of such events, however, can only be outlined in hindsight. This produces explanations after the fact, which are useless for avoiding what they describe. Fortunately, this linear rationality is not the totality of cognitive process.
In The Fifth Discipline: The Art & Practice of The Learning Organization (2002), MIT professor Peter Senge contends that some people are hardwired to have “rich intuitions about complex systems.” This intuitive perspective relates to the ability to form general notions about complex systems, like financial crises, because “intuitions tell them that cause and effect are not close in time and space.” Thus, “they cannot be explained in terms of linear logic,” that is, inductive reasoning. In the hierarchy of decision making, leaders must not wait to take corrective action until their linear faculties have come to terms with the zeitgeist; by then the storm will have passed, its damage already done. Intuition operates differently. It senses rather than sees. It knows that something is not right, that “tangible, easily measured indicators [are] masking deeper problems.” It can feel a change without assigning to it a pressure system, humidity levels, or wind speed.
In his classic biography Einstein: His Life and Universe, Walter Isaacson quotes Einstein: “A new idea comes suddenly and in a rather intuitive way.” Einstein continued: “Intuition is nothing but the outcome of earlier intellectual experience.” Intuition derives from exposure and expertise, yet functions differently than linear cognition. Einstein said he applied something more than the logic of induction to arrive at some of his greatest insights. Buffett says the same regarding his most spectacular investments.
“The simplest picture one can form about the creation of an empirical science is along the lines of an inductive method. Individual facts are selected and grouped together so that the laws that connect them become apparent… However, the big advances in scientific knowledge originated in this way only to a small degree… The truly great advances in our understanding of nature originated in a way almost diametrically opposed to induction. The intuitive grasp of the essentials of a large complex of facts leads the scientist to the postulation of a hypothetical basic law or laws. From these laws, he derives his conclusions.” (Isaacson, p. 118)
Intuition is not a straightforward story
Malcolm Gladwell’s bestseller Blink opens with a memorable story of a sculpture about which several experts had strong visceral reactions: in their guts they felt the statue was fake but were unable to articulate precisely what made them feel so uneasy. How many fell into this camp as the last financial crisis unfolded? More managers likely knew it was a bubble than knew why; even fewer took action in response. Lacking a sequential narrative of causation, “left brain” overload prevented “right brain” action.
Gladwell does not blindly trust intuition. Later in the book he describes a massive failure of intuition—the election of President Warren Harding. People voted for someone who looked strong and decisive without any other reason to believe he was so. (Do I hear George Santayana’s admonition ringing in my ears?) No less than Daniel Kahneman, author of Thinking, Fast and Slow (2011) and reigning academic authority on intuition, also warns that our biases often compromise the effectiveness of our intuitive capacity.
Jason Zweig, an acquaintance and Wall Street Journal writer, further examines the practical applicability of intuition in Your Money & Your Brain: How the New Science of Neuroeconomics Can Make You Rich (2007). He differentiates between thinking and feeling. The “reflexive,” or feeling, system is essential to pattern recognition. It perceives risk and reward “rapidly, automatically, and below the level of consciousness.” As a sort of filter, it lightens the load for the “reflective” system, which Zweig describes as more systematic and contemplative.
Despite its limitations, as you will see in the following post, intuition can be a powerful tool if it prevents the paralysis that results from overloaded analytic cognition. It is therefore an essential partner to analysis. Unlike a pair of kidneys, which provides only system redundancy, two eyes are more than a backup: they provide perspective, a depth of field. We need both for parallax vision. Next week we will examine how intuition and analysis played into the financial crisis of 2007.