
Learning to see (or pay attention) again

Patterns. Brains are beautifully wired for patterns.

Training instinct

Tech support was my way into software. With each call and each email, I heard common threads. In time I anticipated what was wrong more quickly and knew where to ask for missing details. Experience trained instinct.

Quality assurance engineering was my next step after tech support. The software community was trending toward cross-functional teams, and I became one of two QA engineers embedded on a team of consultant developers. For a time, old habits served me well: I had “domain knowledge”. We worked on new problems, which added new solutions to my experience. I asked many, many questions. Questions led to new insights, but sometimes the questions I asked were a lazy jump at a conclusion.

When instinct fails

Last week, in the last hour of the work day, I researched a bug: a visitor signs in and opens a menu, and the contents are empty. The visitor goes to another page and opens the menu, and the contents are present. I jumped:

What’s the source code here—stylesheets, markup, scripts? When did it change last? Changes introduce bugs. Oh, that change, I suspected something would slip through… maybe this try-catch isn’t catching properly?

Thank goodness I had a weekend to be wrong and not attempt to fix it.

“Seeing”

Monday morning, I gave my status update on the bug, noting that I suspected a regression from the last change to this codebase. It was a bold, stupid claim. The bug waited again until the end of the day, when I found a few moments and looked again, this time seeing.

Think. This catch statement only logs; the core logic is elsewhere. Maybe this isn’t the problem…
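To make that dead end concrete, here is a rough sketch of the shape of the code I was staring at. The function and parameter names are invented for illustration, not from the real codebase:

```javascript
// Hypothetical sketch: the try/catch I suspected. Its catch block
// only logs; it doesn't change state or skip any rendering, so a
// swallowed error here couldn't explain an empty menu.
function trackMenuOpen(logEvent) {
  try {
    logEvent("menu-open"); // incidental bookkeeping, not the menu logic
  } catch (err) {
    // The catch only logs; the code that actually fills the menu
    // runs elsewhere, untouched by anything that happens here.
    console.error("tracking failed:", err);
  }
}
```

Even if that logger threw on every call, the error would be swallowed and logged, and the menu would render exactly as before.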

I went back to the start. The menu elements are hidden with a default display: none style. I had skimmed the stylesheet before, but I looked again. I saw it. This time I wasn’t assuming; I was seeing.

There it was. Another selector, later and more specific than the default, removed the hidden style to reveal the content when appropriate. But that selector was scoped to one sign-in condition, not the one where my bug showed up. The selector needed to be extended.
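The pattern looked roughly like this; the class names below are invented for illustration, not the real stylesheet:

```css
/* Menu contents are hidden by default */
.menu .contents {
  display: none;
}

/* A later, more specific selector reveals them, but only under
   one sign-in condition, not the one where the bug appeared */
.account-page .signed-in .menu .contents {
  display: block;
}
```

Extending that second selector (or adding a sibling rule) to cover the other sign-in condition reveals the menu everywhere it should appear.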

Also known as: “slow thinking”

I assumed. I let gut, instinct, and overactive pattern matching give the answer. Usually that’s efficient; here it was lazy.

Patterns are fast, until they aren’t. Acting on my assumption, I might have wasted days making code more consistent and decoupled, with no bug squashed and a colleague’s work unfairly blamed. Assumptions made an ass of me. Instead, “seeing” helped us work through a couple of other debugging problems together. In one, an error message pointed at the problem: we made one change, tested, and observed. In the next, we referred back to access and error logs frequently to see whether a server received requests and how exactly they failed.

What I refer to as “seeing” is resisting the urge to jump to a conclusion from an article’s headline alone: instead, read each word, line by line, letting meaning take shape as you go. A developer took the time to print an insightful message in the terminal or browser console. To be fair, error messages aren’t always helpful, which is why this example sticks in my memory. But the general trend is toward better error handling and logging, and I appreciate it; it’s making me a better developer.

Further reading

Thinking, Fast and Slow by Daniel Kahneman covers fast and slow modes of thinking in great depth; this video is a cheaper investment and introduces the topic well.