Thinking, Fast and Slow – Scientists are human too

I’ve just finished reading the excellent book “Thinking, Fast and Slow” by Daniel Kahneman, a psychologist who has had a profound impact on economics; he won the Nobel Prize in economics in 2002 for prospect theory, which tries to provide a realistic model of how people actually make choices, for example by over-weighting low-probability events and, in particular, by treating gains and losses differently: we are all loss-averse, and so put more negative weight on losing something than positive weight on gaining the same thing.

The book is great: not only are the many ideas he presents important, he presents them in a way that is both scientifically rigorous and full of playfulness and humility, making the reader realise that he or she is just as prone to the same “mistakes” as anyone else. He describes some of his own behaviour in these terms, showing that he himself (just like basically every human) has the same blind spots. I strongly recommend it.

Reading this book made me reflect on scientific practice. There are too many ideas to explore each one, so I’d like to focus here on his first theme: the difference between the fast-thinking, intuitive “System 1” and the slow-thinking, deliberative, work-it-out-with-pencil-and-paper “System 2”. System 1 is your “gut instinct” and actually makes most decisions in your life; it evaluates everything constantly, notes things which are surprising, and offers some explanation for anything that needs a decision, often drawing on previous experience and relying heavily on the ability to create a narrative that fits the situation it sees. System 2, the deliberate, conscious system, takes more effort (elegant studies have actually quantified this effort in a variety of ways) and is only activated when needed, most obviously for clearly complex problems, such as working through a non-trivial calculation (what is 17 x 33? You can’t do this via System 1, so you have to activate System 2 to solve it). System 2 also keeps a sort of lazy oversight of the answers emerging from System 1, but, and this is the critical point, System 1 is incapable of distinguishing the times when its narrative explanations are truly a good fit for what is going on from the times when it is just winging it completely, often subtly substituting the question it was asked with an easier one it can answer. System 2 therefore only catches a small proportion of the errors thrown up by System 1; it applies a cursory “does this feel right?” check and then moves on. How one poses (or, in the jargon, frames) a question is often the largest predictor of how people answer it. Again, Kahneman walks through numerous fallacies, about betting, happiness, risk-taking and trade-offs, which everyone, including professionals, statisticians and scientists, makes. It is a humbling experience to be forced to realise that you are actually as “irrational” as anyone else.

Many people think of scientists as among the more quantitative, rational people, and indeed the process of science prizes rationality above perhaps anything else. But, as all scientists know, we are very human, and there is a considerable amount of on-the-hoof thinking: most obviously when we are (in effect) betting on which sets of experiments or analyses will give us the most insight into a problem, but, if we’re being honest, on lots of other things too: on judging how good a piece of science is from its spoken or written presentation; on using the reaction of other scientists to shape our own judgement; on following fashions in science. The power of narrative is particularly strong in science. This reminds me of a recent(ish) paper of ours which got slaughtered on the first review, with the reviewers saying it was “too dense” and “just about statistics” with no impact. In the revision, alongside each graph showing a genome-wide trend, we also picked out, on the left, an example genomic location that was a sort of “ideal” illustration of the effect. The change in the reviewers’ response was striking, even though, the genome being such a large place, a short Perl script can find you nearly any configuration of events you might want. These examples, though, established a narrative and appealed to the readers’ “System 1”; then, when we invited System 2 to assess the P-values of the Spearman correlations (which, to be fair to us, were all significant), it had a worked-out story to confirm rather than having to construct the story itself.
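(As an aside, here is a minimal sketch of how cheap that kind of cherry-picking is, written in Python rather than the Perl mentioned above, with entirely invented signal names and toy sizes; it illustrates the general point, not the actual analysis from that paper. It generates two unrelated random “signals” along a pretend genome and simply keeps the window where they happen to agree best; with enough windows to choose from, some region will always look like an ideal example.)

```python
# Toy illustration (hypothetical names and sizes): scan a long stretch of
# positions for the window where two *unrelated* random signals look most
# correlated. With many windows to choose from, one of them will always
# "tell the story" we want, which is exactly why such examples are cheap.
import random

random.seed(0)

GENOME_LENGTH = 1_000_000   # toy size, far smaller than a real genome
WINDOW = 1_000              # size of each candidate "example" region

# Two independent random signals standing in for, say, a chromatin mark
# and an expression level at each position; by construction unrelated.
signal_a = [random.random() for _ in range(GENOME_LENGTH)]
signal_b = [random.random() for _ in range(GENOME_LENGTH)]

def window_agreement(start):
    """Crude score: fraction of adjacent steps where both signals move the same way."""
    agree = 0
    for i in range(start, start + WINDOW - 1):
        da = signal_a[i + 1] - signal_a[i]
        db = signal_b[i + 1] - signal_b[i]
        if da * db > 0:          # both rising or both falling together
            agree += 1
    return agree / (WINDOW - 1)

# Keep the window that best "illustrates" a relationship that does not exist.
best_start = max(range(0, GENOME_LENGTH - WINDOW, WINDOW), key=window_agreement)
print(f"prettiest window: {best_start}-{best_start + WINDOW}, "
      f"agreement = {window_agreement(best_start):.2f}")
```

The moral being that a single pretty example is, on its own, evidence of almost nothing; it should accompany the genome-wide statistic, not substitute for it.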

But, just as in personal life, the emphasis on narrative can be very misleading. The focus on creating clear-cut stories encourages people to leave out inconsistent results, or not to explore places where their models (or narratives) don’t provide an answer. In many ways we have the process of competition between scientists in a field to counteract this, though there is the danger of an entire field creating a self-sustaining narrative in which alternative scenarios are not explored. This sort of “narrative buy-in” is probably a feature of all human endeavours, be they businesses, financial markets or intelligence and government services, and science is not immune to it. This thinking has also led me to better understand the age-old idea of thesis-antithesis-synthesis. I had always previously thought that this was sloppy thinking, people accepting a compromise position (the synthesis), whereas in science it is more likely that one approach is right and the other wrong. However, this System 1, narrative-based thinking suggests that most scientific positions contain well-thought-through pieces, each with plenty of consistent observations, embedded in a narrative web that elsewhere rests on weak or even contradictory evidence. When two such positions meet, the narratives might be inherently opposed (thesis and antithesis), but by examining their differences one hopefully creates a new narrative (a synthesis) which preserves most of the hard, supported evidence in each thread. By having to reconcile viewpoints, scientists have to engage their “System 2” brains and break things down into the areas they really know and understand. It would be a fallacy to think that the resulting new narrative is perfect, but it will be better than the previous two. As human systems go, the fact that science values rational thought, backed up by observation, reproducible experiments and analysis, means that I think we probably reach rational understandings faster than many other fields, and this has helped me understand the importance of adversarial views (narratives), which may easily share consistent components despite their apparent contradictions. But we should not kid ourselves that science is perfect and somehow free of this intuitive, loose, System 1 thinking.

Indeed, thinking about System 1 and System 2 processes has made me reflect on organisational and community processes in science. I’ve always implicitly known that bouncing ideas off people is good, and that meetings, even ones which seem to be just going through the motions towards an obvious answer, are in fact worth it because of the deliberation they force. Interestingly, we are all very bad at acknowledging that we make these (frankly) awful, irrational mistakes; our tendency when proved wrong is to create a narrative about why our gut instinct got it wrong in that particular scenario, rather than to question the whole process. Much of what people describe as wisdom is really about being more deliberate and allowing more viewpoints to co-exist in apparent conflict, rather than reaching for snap judgements. Reading this book has made me (when I think about it!) trust my gut instincts less, and appreciate the importance of process and deliberation.

The other thing this book has brought home to me is that I am clearly a human being, with all the loss-aversion and narrative-building fallacies that our brains come with. As is nearly every other scientist. Acknowledging this is, I think, the first step towards catching ourselves, as individuals and as whole fields, making these “System 1” errors.
