
The Mind: A User's Guide (Pt. I)

  • Writer: HC James
  • Apr 29
  • 10 min read


How To Think Clearly

Normally I wouldn’t feel qualified to give tips on mental organisation, as I have always felt like quite a disorganised person. But perhaps it is precisely because I know what it is like to be the person who was always late, always leaving things to the last minute, who failed exams and missed flights among other mishaps, that I can sympathise with the feeling of being overwhelmed by things other people seem to find easy. My default when given any task requiring forward planning and multiple steps was to panic, or get resentful, and wait for it to magically sort itself out. Eventually I realised, to my dismay, that nobody else was going to do my organising for me and that I needed some mental tools to help. When I looked into this, I came to realise that a lot of my thinking was falling prey to all kinds of traps, and that I was wasting huge amounts of time and effort by doing things in a reactive, seat-of-my-pants way.


Fortunately in the past few years there have been a number of books on how to 'think clearly'. These essentially tackle the question of how our brains process and act upon information. This 'information processing' view of the brain is very much in keeping with the times we live in, and it can seem a bit dry - almost comparing the brain to a computer with input, memory and output etc. - but in order to even begin to understand the sheer complexity of the brain we need some way of simplifying things. Probably the defining text here is Thinking, Fast and Slow by Daniel Kahneman, drawing upon decades of research. More recently we have had The Organized Mind by Daniel Levitin and Brain Chains by Theo Compernolle. And then we have the airport ‘How To…’ style books for the international life-hacker on the go. My favourite is probably The Art Of Thinking Clearly by Rolf Dobelli, and I also borrow a lot here from The Decision Book by Mikael Krogerus and Roman Tschäppeler. There are certain common themes which emerge from these works. I have sorted them into four general 'principles' - 'know your brain', 'externalise the chaos', 'one thing at a time' and 'know when enough is enough'. Together they comprise a kind of basic user's guide to thinking (more) clearly.


Principle 1. Know Your Brain


'Know yourself' is one of the oldest pieces of wisdom we have, but what is this 'self'? Without straying too far into philosophy, one of the oldest concepts of the self is the idea that the mind has two sides: the horse and rider metaphor of Zen philosophy, for example, the Ego and Id of psychoanalysis, or the conscious and subconscious we still commonly use today. While we ought to be cautious with any dualistic view of things - after all, we have only one brain - there’s no denying that for a lot of the time I do indeed feel like there are two parts of me, often in conflict, say late at night, when one part of my mind desperately tries to get to sleep while another whirrs away like a broken machine, ruminating on all kinds of nonsense.


In Thinking, Fast and Slow, this whirring machine is what Kahneman describes as System 1 thinking, the ‘fast’ type of thinking, as compared to System 2 thinking, which is slower and more deliberative. System 1 thinking is what you might call our intuitive mind, born of our survival instinct. It is always ‘on’, always scanning the environment, including our social environment, for threats and opportunities. It is goal driven, though the goal may not always be apparent. It excels at visuospatial awareness, allowing us to catch balls and navigate busy roads on a bike or in a car with minimal mental effort. It uses language and is able to think symbolically. It can sift through inordinate amounts of data and find the salient bits of information we need, the information that may matter for our survival. It is unbounded by considerations of time, and very much pulls the future and the past into the present moment when making decisions. This, paradoxically, makes it difficult to be fully in the present moment, because it is continually making predictions, anticipating possible future scenarios or re-living the past.


System 1 thinking is above all incredibly efficient, and it makes appropriate decisions most of the time. In evolutionary terms, it has got us this far. However, in an evolutionary blink of an eye our social environment has changed dramatically: we have gone from aeons of tribal communal living, through agricultural feudalism and industrialisation, to the hyperconnected and complex world we now live in. This has exponentially increased our exposure to information, and the main limitation of System 1 thinking is that it struggles with conflicting and ambiguous data. It is also easily hijacked by emotionally charged information, of which it is fair to say there is no shortage. Moreover, it is not reflective: it is not aware when it is making errors, and this leads to all manner of cognitive biases, which we’ll come to in a moment.


First, without thinking about it too much, take a look at the following two shapes and guess which one has the longer border.

At first glance, it is tempting to say that the square on the left has the longer border, because it looks bigger compared to the skyscraper shape on the right. But, of course, both shapes have the same border length; it is only the area that is larger on the left. If you fell for the trick, it was probably because your System 1 brain made a snap judgement. To see why they have the same border length we have to engage our System 2 brain more, for example by tracing around the edge of the shape on the right, or by seeing that the width plus the height, times two, is the same for both.
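The 'width plus height, times two' point can be put in numbers. The exact dimensions below are assumptions, since the original shapes are shown as an image, but the principle holds for any pair of rectangles: the border length depends on the sum of the sides, not on the area.

```python
# A sketch of the perimeter trick. The dimensions (4x4 square, 1x7
# 'skyscraper') are illustrative assumptions, not taken from the image.

def perimeter(width, height):
    """Border length of a rectangle: twice the sum of width and height."""
    return 2 * (width + height)

def area(width, height):
    """Area of a rectangle: width times height."""
    return width * height

square = (4, 4)      # looks bigger...
skyscraper = (1, 7)  # ...but has the same border

print(perimeter(*square), perimeter(*skyscraper))  # 16 16 - equal borders
print(area(*square), area(*skyscraper))            # 16 7 - very different areas
```

The square's area dwarfs the skyscraper's, which is exactly what fools the snap judgement; the borders are nevertheless identical.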


A slightly trickier puzzle, used in Thinking, Fast and Slow, and often quoted, is the coffee and cookie question. A coffee and cookie cost £1.10. The coffee is £1 more than the cookie. How much does the cookie cost? Most of us intuitively conclude that the cookie is 10p, and even if we suspect that we are being tricked, we still struggle to see how the answer could be otherwise. To reach the true answer (cookie = 5p, coffee = £1.05) most of us again have to engage our System 2 brains and maybe even work it out using symbols (such as the solution at the end*). I say most of us because those already comfortable with numbers would be able to solve this using their System 1 thinking, because slow and effortful System 2 thinking eventually becomes System 1 thinking with enough practice. For others, putting the solution down on paper is the next best way to get the two systems working together, kind of mentally giving our System 1 a helping hand.

 

The puzzles above illustrate how a complex problem can easily short-circuit our brains. Our quick System 1 jumps to a conclusion and it is only when we look at it a second time that we see we were tricked. This can have serious implications in medicine for example, or in any scenario where we are trying to assess risk (you can look up the 'medical test paradox' online for a good example of this). Our System 1 intuitive brains are brilliant, but their main flaw is that they don't know when they are falling into a trap. Unfortunately the modern world is filled with such traps.
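The medical test paradox mentioned above can be made concrete with a short calculation using Bayes' rule. All of the numbers here (a 1% prevalence, 90% sensitivity, 9% false positive rate) are illustrative assumptions, not figures for any real test, but they show how System 1's snap reading of 'a positive result' can be badly wrong when a condition is rare.

```python
# The 'medical test paradox': even a fairly accurate test gives mostly
# false positives when the condition is rare. Illustrative numbers only.

prevalence = 0.01      # P(has condition): 1% of people
sensitivity = 0.90     # P(positive | has condition)
false_positive = 0.09  # P(positive | no condition)

# Bayes' rule: P(has condition | positive test)
true_positives = prevalence * sensitivity
false_positives = (1 - prevalence) * false_positive
p_condition_given_positive = true_positives / (true_positives + false_positives)

print(round(p_condition_given_positive, 2))  # 0.09 - only ~9% despite a positive result
```

With these numbers, a person testing positive has under a one-in-ten chance of actually having the condition, because the healthy majority generates far more false positives than the rare condition generates true ones.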


Cognitive Bias


An internet search on cognitive bias will reveal an extensive list of mental traps we are prone to falling into. For example, there is the Sunk Cost Fallacy - our reluctance to let go of any kind of investment (a car, a hand at cards, a relationship, etc.) into which we have sunk a lot of money or time, even though holding on is only likely to lose us more. This inability to let go is, of course, completely irrational, because really we ought to base our decisions on what we have in front of us now, not on what it was or could have been. Cognitive bias could be said to occur when the brilliance of our intuitive minds comes up against the complexity of modern life. We’ll limit ourselves to looking at three of the better-known ones:


The Anchor Effect is our tendency to strongly favour first impressions. This is very much a System 1 bias, working on the principle of right-most-of-the-time. It is little more than an energy saving measure, saving us the effort of having to revise our assumptions. Yet really there is no reason whatsoever why our first impression should be more correct than any subsequent impression. It is simply the first snapshot we have had of something at a single moment in time. The danger of the anchor effect is that we then simply favour information that conforms to this initial impression and subconsciously neglect any further information that might contradict it. This leads us to…


Confirmation Bias. We instinctively favour and seek out information that affirms our assumptions. As with the anchor effect, our System 1 thinking deems it too much effort to seek out and process information that challenges our assumptions, effectively saying ‘why bother?’ In the close-knit tribal living of our ancestors this was less of a problem. If anything, developing a set of assumptions - codes for living, ideas about other tribes and so on - would have had a survival advantage, and there would have been little need to revise them. However, in the complex social and economic environments we now inhabit, confirmation bias leads us to get stuck in mental ruts, wasting time and mental energy defending pre-held beliefs, or at least stopping us from learning new things. Of course, our assumptions may actually be correct, our beliefs may accurately correspond to reality, but not many of us will make the effort to test or challenge them. After all, as pointed out in The Decision Book, who spends time Googling counter-arguments?


Availability Error refers to our tendency to seek out three types of information above all else: we like information that is simple, available and autobiographical. Simple data is easier to digest, mentally speaking, available data is easier to find, and autobiographical data is easier to relate to. In other words, we prefer simple stories over messy reality, and if it is my story then it is even more convincing. We mistake personal experience for universal truth: if a waiter in Paris is rude to me, I'll tell everyone who'll listen that all waiters in Paris are rude, as if it were a statistical fact. If I receive a vaccination in flu season, then become unwell with an unrelated viral infection (which is statistically quite likely), not only might I falsely attribute the illness to the vaccine, but I will assume that this is true for most people, and then confirm this assumption by seeking out similar stories, of which there is an abundance. The sheer availability of a certain story seems to make it true, but of course only evidence can do this. Evidence, especially statistical evidence, is very much a System 2 job, and as we've seen, emotionally charged stories effectively bypass our System 2. It's certainly no easy feat to overcome these inbuilt biases.


One of the advantages of the System 1 and 2 approach is that it is essentially neutral, neither seeking hidden explanations for our behaviour nor judging whether our thinking is right or wrong. If you despair at your overthinking mind, just remember that it is a result of evolution. There is nothing 'wrong' with having a hyperactive brain. Struggling to focus is a normal response to living in an environment where we are bombarded with information, choice and stimulation. We all have a fast brain which runs on autopilot and doesn't have an off switch. Thinking in a deliberate 'slow' way requires conscious effort, which to our brain translates as extra calories, and naturally it resists having to use them. We can also see that a lot of the errors people make in their choices - in relationships, in work, with money, in voting and so forth - are not so much down to individual failings as a product of cognitive biases that we all have. At their most fundamental, they are a product of how our brains try to use energy in the most efficient way possible, on the principle of right-most-of-the-time. There is no doubt that these biases are leveraged and exploited by unscrupulous players to make us behave in certain ways, so exposing them is worthwhile, but doing so requires moving beyond our - very System 1 - tendency to judge.

In summary, while it is important to acknowledge the brilliance of our fast, intuitive brain, it is equally important to be able to step back now and then and take the odd reality check. If anything, it can be a release to let go of certain decisions and views which no longer serve us. We can also use System 1 thinking to our advantage, so long as we continue to play to its strengths. As well as being fast and efficient, it can sift through far more data than our deliberative System 2 brain, and it is the part of us which usually knows us better than we know ourselves. As an experiment, The Decision Book suggests the ‘coin toss’ trick. If faced with a tricky, though perhaps trivial, decision (such as ‘do I go out tonight?’ or ‘what should I have for dinner?’), toss a coin. Usually, by the time the coin has landed, and while it is still covered, your System 1 brain has sorted through the variables and knows what you would rather do, and you don't even need to look at the coin. The decision has been made. We'll be looking more at the art of decision making next...


*Solution to coffee and cookie problem:


        Let's call the coffee C and the cookie D (both in pence). The problem - "a coffee and cookie cost £1.10; the coffee is £1 more than the cookie; how much does the cookie cost?" - can be written as:

C + D = 110. If C = D + 100, then what is D?


We can now see more clearly that the cookie, D, isn't 10, because if D were 10 then C = 10 + 100 = 110, and together they would cost 120, not 110.


If C = D + 100, then C + D = 110 is the same as (D + 100) + D = 110

or D + D + 100 = 110

Take 100 away from both sides and:

D + D = 10

or D x 2 = 10

which means that D = 5
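The algebra above can also be checked the lazy way, by letting a short brute-force loop try every cookie price in pence until the two conditions are both satisfied. This is just a sketch to verify the worked solution, not part of the original problem.

```python
# Brute-force check of the coffee and cookie solution, working in pence:
# find the cookie price d where the coffee (d + 100) plus the cookie
# totals 110p.
for d in range(0, 111):
    if (d + 100) + d == 110:
        print("cookie =", d, "p, coffee =", d + 100, "p")
        # prints: cookie = 5 p, coffee = 105 p
```

Only d = 5 satisfies both conditions, agreeing with the algebra: a 5p cookie and a £1.05 coffee.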


About The Writer

HC James is from London and worked as a teacher before switching careers to medicine. He currently works as a doctor in a south London Emergency Department and in his spare time visits family in California.


© 2023 by Going Places. Proudly created with Wix.com

bottom of page