3 lessons IBM's Watson can teach us about our brains' biases

By Jane Porter

What can artificial intelligence teach us about human intelligence? Quite a bit, it turns out. IBM's Watson had its crowning moment back in 2011 when the cognitive computing system famously destroyed its Jeopardy champion contenders, but lately Watson has been wading into far more murky cognitive territory—creative thinking.

"The crowning achievement of human intelligence is our ability to be creative," says Steve Abrams, director of IBM Watson. Abrams is in the business of recreating cognitive processes in computers. "I have to know how people work," he says. And one of the biggest pieces of understanding how people work is uncovering what it means to think creatively.

From Chef Watson, which teamed up with the Institute of Culinary Education to develop unexpected food combinations, to Watson Tone Analyzer, which performs automatic linguistic analysis to interpret not just the information but the tone of people's writing, Watson is reaching into all sorts of creative applications.

And the cognitive computing system, which has been snapped up and applied in various ways by some 5,000 different application developers in the past year, can also teach us quite a bit about the way our minds work. Years of studying cognitive computing and developing cognitive systems for Watson point to a number of ways our brains function during creative thinking and problem solving.

"What is creativity? It's having a message to convey or a problem to solve in the face of some sort of constraints, and figuring out how to achieve that goal in the face of those constraints," says Abrams.

But what are some of the cognitive roadblocks that might be standing in the way of our ability to think and problem solve most creatively? Fast Company spoke with Abrams about what he's learned about the cognitive biases we bring to the table that a computing system like Watson is designed to overcome.

The Brain's Ambiguity Effect

In the absence of information, we tend to avoid choosing options we don't know enough about or don't feel fully confident in. Behavioral scientists call this the "ambiguity effect"—a cognitive bias in which your decision-making is impaired by a lack of information, says Abrams.

Take, for example, the choice by some investors to put their money in a less risky fixed-yield instrument rather than opting for a more volatile investment, like stocks and funds, that has less predictable results but offers higher returns over time. "There are people for whom that unknown factors into the equation more than it should," says Abrams. Recognizing those biases and seeking out information in the face of doubt can help keep the ambiguity effect from preventing you from making the best and most informed decision.
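The investment trade-off Abrams describes can be reduced to a bit of expected-value arithmetic. The sketch below uses entirely hypothetical numbers (the article gives none) just to show how a volatile option can carry a higher expected return even when the uncertainty makes the guaranteed option feel safer:

```python
# A toy illustration of the ambiguity effect with made-up figures:
# the volatile option has the higher expected return, yet its unknown
# spread of outcomes can make the "safe" choice feel better than the
# arithmetic supports.

def expected_return(outcomes):
    """Expected return given a list of (probability, return) pairs."""
    return sum(p * r for p, r in outcomes)

# A fixed-yield instrument: one certain outcome (hypothetical 3%).
fixed_yield = [(1.0, 0.03)]

# A volatile investment: uncertain outcomes, but a higher average
# (hypothetical probabilities for gain, flat, and loss scenarios).
volatile = [(0.5, 0.20), (0.3, 0.02), (0.2, -0.10)]

print(expected_return(fixed_yield))  # 0.03
print(expected_return(volatile))     # 0.5*0.20 + 0.3*0.02 + 0.2*(-0.10) = 0.086
```

Even though the volatile option's expected return is nearly three times higher in this toy setup, the ambiguity effect describes investors weighting the unknown downside more heavily than the numbers justify.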

Our Tendency Toward Confirmation Bias

Another bias we're often susceptible to is confirmation bias, or the tendency to look for and favor evidence that confirms what you already believe to be true. This affects the way we approach problem solving—from the key words we choose to look up on a search engine to the details we selectively notice in favor of our own opinions.

Take, for example, our taste preferences in food. When Chef Watson called up a flavor pairing between apples and olive oil, the immediate reaction from the human chef involved was skepticism. Apples and olive oil are not a typical combination. But on Watson's recommendation, the chef went ahead, resisting his own confirmation bias, and found the pairing was surprisingly complementary.

Are You Suffering From Not Invented Here Syndrome?

To be clear, this isn't an actual syndrome, but the phenomenon (sometimes referred to as NIH) is very much out there. It's the tendency to be more critical of someone else's ideas than of your own. We tend to apply higher standards to other people's solutions than to our own, rejecting external solutions as inferior.

In the tech world, this kind of NIH is often what leads people to reject software or developments made outside their company in favor of their own in-house version, however inferior it may be. But this happens across industries and organizations.

"Those biases that we aren't even conscious of having impact the way we interpret results," says Abrams. "We are trying to build a system that will help us come to a more reasonable set of interesting options, despite the fact that we as human beings have these biases."

[Photo: Ben Hider/Getty Images]