Explaining Machine Learning Concepts to Non-Technical People

James Kotecki
Machine Learning in Practice
Jan 9, 2020


Can ewe hear what I’m sayin’? Photo by Bill Fairs on Unsplash

Grab a friend and try this quick experiment. Think of a simple song that almost everyone knows, like “Mary Had a Little Lamb.” Then using your finger, tap out the beat of the song on a table. Don’t hum or sing the tune - just tap out the rhythm while the song plays in your head. What do you think are the odds that your friend can guess the tune you’re tapping?

In Harvard Business Review, Chip and Dan Heath write about a 1990 Stanford experiment by Elizabeth Newton involving a similar game.

Over the course of Newton’s experiment, 120 songs were tapped out. Listeners guessed only three of the songs correctly: a success ratio of 2.5%. But before they guessed, Newton asked the tappers to predict the probability that listeners would guess correctly. They predicted 50%. The tappers got their message across one time in 40, but they thought they would get it across one time in two. Why?

The reason for the yawning gap between the communicators’ expectations and the recipients’ reality, say the Heaths, is the “Curse of Knowledge.” They write that “once we know something — say, the melody of a song — we find it hard to imagine not knowing it. Our knowledge has ‘cursed’ us. We have difficulty sharing it with others, because we can’t readily re-create their state of mind.”

Young lady or old woman? It’s the famous “Boring Figure.”

You’ve certainly seen a version of the above image before. Is it a young woman or is it an old one? It’s both, of course, and your mind switches between them. But can you remember how you saw it before you knew the trick? Probably not.

The Curse of Knowledge applies directly to machine learning projects. Data scientists, like all experts, can find it challenging to communicate with ML novices. The experts may use terms and concepts that are obvious to them — concepts that make communication easier if the recipient is also an expert. But to a non-expert, those same terms sound like inscrutable jargon, and the concepts seem well beyond reach. The two sides are, almost literally, speaking different languages.

Technical teams need to put themselves in the stakeholders’ position. They need to empathize, realizing that business leaders often don’t have extensive machine learning and data science training. That’s what the technical team is for. This is not to say that the business leaders aren’t smart. There are plenty of intelligent people who could be machine learning experts if they had years of training and experience, just as there are plenty of people who would be capable of becoming doctors if they’d gone to medical school. Bridging the gap between ML expertise and business leadership is not a matter of “dumbing down” information, because it’s never a good idea to assume there are “dumb” people in the conversation. Instead, consider these ways to reverse — or at least weaken the hold of — the Curse of Knowledge.

Frame Answers in Terms of Business Goals

Start every explanation by restating the overall business goals and subgoals you are trying to achieve. Business impact is always the ultimate goal, so make it the shared foundation of every client interaction.

Avoid or Explain Jargon

In their article, the Heaths recommend “concrete language” — but that’s a potential challenge for data scientists. To them, the most concrete language may be technical terms that business leaders don’t understand.

Don’t be that guy.

It might be more appropriate to call it “bedrock language” — as in, simplifying language to a more foundational level. In fact, the ability to explain a concept in simple terms is a sign of true mastery.

For inspiration, check out the Explain Like I’m Five forum on Reddit, look up Wired’s “5 Levels” video series, in which an expert explains a concept at five different levels of complexity, or read Randall Munroe’s book “Thing Explainer,” which lays out scientific concepts where “titles, labels, and descriptions are all written using only the thousand most common English words.”

Use Analogies

As a rule of thumb, try to explain technical concepts with analogies a 5th grader could grasp. Like the avoidance of jargon, this skill requires a strong grasp of the material. In other words, if you can’t explain something with a simple analogy, it may be because you don’t understand it well enough yourself.

Framing complex concepts in terms of playing sports, driving a car, or other everyday tasks gives everyone — from novices to experts — a framework for communication.

If it helps, you can think of analogies as stories about real things. In their HBR article, the Heaths recommend stories as a Curse of Knowledge combatant, because “they force us to use concrete language.” ML projects can be abstract; some don’t even have a user interface by which to judge success. Analogies can help make them more real.

Say Less

Keep it simple.

“You can always add detail, but you can’t take it away!” says Bill Franks in his article, “A Common Trap That Undermines Analytics Credibility.” Eager to show what they know or to dive into the nuances of a given issue, technical experts can overwhelm with too much information and/or waste time on commentary that others don’t follow (and don’t even really need to know).

It’s better to start with a simple, short answer and then let stakeholders ask for more information if they want it. Simple sentences are much less likely to be misinterpreted. Compared to technical experts, stakeholders are likely to care more about results than process details.

Layer Your Explanations

Build a foundation of basic concepts before layering on more complexity. Stakeholders may be very capable of comprehending more advanced concepts if they can stand on a solid foundation. And even if they don’t get advanced ideas, at least there’s a basic shared understanding.

It’s often best to start with business concepts before moving to more complicated, technical levels.

Concepts are built one on top of another. For instance, you can’t understand machine learning if you don’t first understand data.

The key here is to start with the basics. If you start with the harder stuff, some may assume they can’t follow the conversation and tune out. To quote Bill Franks again: “If you always drill down to the gory details, you’ll fail with all but those wanting the gory details. If you always start high level and only get more detailed as asked, then you can succeed with everyone!”

Artfully Probe for Understanding

If there’s doubt about whether stakeholders actually know what’s going on, it’s always a good idea to ask. Do this tactfully. It’s not a good idea to put anyone on the spot — it can cause anxiety, and people might feel pressure to pretend they get it. Instead, try phrases like:

▪ “Before we dive in, would it be helpful to start with a high-level review here?”

▪ “As you know . . . [Insert simple explanation before going on to the harder one.]”

▪ “For the benefit of those who are joining us for the first time . . .”

Technical experts may wonder whether their reputation (or pride) will suffer if they distill their training and experience into simple terms. Aren’t stakeholders expecting them to be, well, technical?

Yes — but progress can only take place if both sides understand each other. The risks of miscommunication (failed project, different expectations, etc.) are far greater than the risk that a data scientist will briefly feel silly for using an analogy.

Simple communication at the beginning creates a platform upon which machine learning experts can be as technical as possible for a given audience. Once shared understanding is confirmed, the conversation can go deeper. If there’s no shared understanding, the conversation will go nowhere.

James Kotecki is the Director of Marketing & Communications at Infinia ML, a team of data scientists, engineers, and business experts putting machine learning to work.
