by Monty St John
As humans go, it doesn’t take us long to form an opinion or make a decision. We go with our gut an astonishing amount of the time. Reflection, when we do it, seems to give outsized weight to the smallest of facts that resonate. Daniel Kahneman, a psychologist who won the Nobel Prize in economics for his work on decision making, describes in his book Thinking, Fast and Slow the ease with which we draw conclusions: “The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”
Dr. Kahneman points out that we are quick to leap to conclusions because we give too much weight to the information immediately in front of us and fail to consider the information just out of sight. He called this tendency “what you see is all there is.” Personally, I think of it as looking through a microscope or a spotlight — everything inside is crisply illuminated while everything else fades into the background.
Consider a situation where we observe a series of phishing attacks on our organization, combined with a successful malware infection or two bearing plenty of indicators similar to a known adversary group. As an analyst — security, operations, intelligence, etc. — how often do we succumb to the spotlight effect, putting that evidence in the limelight and looking at little else? Offered this information, we find it easy to take the readily available path and raise the flag of APT.
Of course, a microscope only zooms in to cast light on a single point; everything around it fades away and becomes invisible. In this instance of what obviously looks like APT activity, we don’t immediately think to ask the obvious questions. For example, instead of instantly declaring it a certain nation state or high-powered criminal group, do we know the observations we have are unique to them? Could the indicators reflect a different organization, or a similar one? Could they be a false flag planted by a different adversary? Is what we see even enough of a picture? Have the observed events been laid out on an action-artifact decision tree to understand what’s missing from observation but must have been in place for the attack to occur?
Additionally, what if we dug deeper and discovered it wasn’t the isolated event we thought? That it connected to earlier attacks we missed because we didn’t have the pattern to identify them? Worse, perhaps we weren’t looking at all! When we shift the focus of the light — zooming in and out, casting it about to look in different directions — the situation we thought was defined rapidly starts to look different. It’s impossible to form a good conclusion — to build a case for quality analysis — without performing this shift in point of view. Yet building a case for an opinion was incredibly easy without doing it.
That, of course, is the core difficulty of decision making. Our view of information — under the microscope, in the spotlight—is too limited to make a good decision. What we see through the small lens is rarely enough. Worse, without good process that enforces the widening of the scope to look at a more meta level, we forget a narrow focus exists at all, living within the tiny confines of our sample slide without regard to the world beyond it.
It might be rude to say, but humanity has a terrible record when it comes to decisions. Our brains are full of irrationality and bias. Dig into the stats on career choices, look at how people trade on the stock market, or examine the choices made by businesses. Think of all the high-stakes mergers and acquisitions that have happened over the years. AOL–Time Warner, HP–Compaq, Quaker–Snapple — these are just some of the big ones. An analysis of 2,500 such deals in the Harvard Business Review showed that more than 60% of them destroyed shareholder value. Can you imagine? In such high-stakes deal making, more than half the time, value is destroyed instead of built.
Even going with our gut is not necessarily better. I’ve personally heard and used that advice, and it’s been a mixed bag of regret and happiness. It’s pretty questionable — both referring to the gut in the third person and making it out to be an oracle of wisdom. All I have to do is think about whether I should eat that late at night or stay up till 3 in the morning. My gut says it’s okay: I can burn those calories off tomorrow, and sleep, well, it’s optional to a certain extent. The fact is we just can’t trust our guts. The gut is no better a guide than our irrational and emotionally tempered brain.
What should you trust? If the gut is out, and our brains along with it, what’s left? If you happen to be an analyst, it is pretty easy — you go with analysis. The more structured and rigorous, the better, or so the thought goes. Yet even these critical thinking skills come with warnings: confirmation bias, fallacies of logic. Analysis isn’t free of its own flaws.
What strengthens analysis is building in good processes, where the goal is not so much the heights of analytical thought as the weeding out of uncertainty. Process forces a look into contradictory perspectives and data — think A/B testing or playing Devil’s Advocate. We use process to force the widening of the spotlight and see a bigger piece of the picture. Employed properly, good process weeds out faulty logic. The opposite, however, is not true: superb logic does not guarantee a sound process.
Why a process? Simple — understanding our shortcomings is not enough to fix them. Does knowing you are ill make the sickness go away? Does knowing your vision isn’t 20/20 make you see better? More to the point, can we correct a bias in our mental model just by knowing about it?
Process is the counter to our flaws, the irrationality of the brain and confused advocacy of our gut. It’s — to make a pun — a process. Process guides us to make better choices, by confronting biases and limited viewpoints — and showing how to overcome them.
If you have read this far, you might be wondering where the process is that I’ve been hollering about. Good point. A note, however: just like your brain and gut, your process is unique to you. Admittedly, some guideposts do exist, as you’ll see below.
- Look for choices. Strong arm away the tendency to use a narrow focus. Look at the meta picture. Fight to find options — then analyze them to break away any confirmation bias that might have crept in. When you have those, then …
- Throw it to the wolves. Whatever assumption you make, get it in front of others to get beat up. Never live only inside your head. Ask questions, seek and collect data beyond what you know or can observe immediately. After testing it — make the choice. With that in hand, just make sure you didn’t step on your sword …
- Get some space. Step back. Look at the option chosen from a different viewpoint, another lens. Evaluate short term impacts to make sure they don’t unduly influence your goals. Once you’ve got it, live with it. Be optimistic but …
- Belly up to the table. All that decision-making and process can still be totally wrong. It’s an uncertain future, with influences that can skew any odds thought to be certain. This is where you own the outcome, whether it falls flat or powers into space.
A couple of points to close. No process is perfect. The only true assurance I can give is a tiny lesson on probabilities: following a process invites consistency, and a consistent process tends toward a more stable outcome than a chaotic one. The above process is pointless in a firefight, or when someone pulls out in front of you on the highway. I used to tell my daughter that if she had five fingers of time — one minute per digit — for any decision, then she had time to use a process to make it. I invite you to do the same.