Problems aren’t inherently moral.

CNTRD is fortunate to collaborate with many emotionally and intellectually intelligent humans. Some of them nurture deep spiritual beliefs and practices; others hold atheist views.

Yet we all frequently inquire about morals, both implicitly and explicitly. Our inquiries most commonly arise in the context of the problems we work on, the situations that arise, and the clients we work with. Given the breadth of interactions that inspire moral inquiries, we’ve developed a mantra: problems aren’t inherently moral.

We have the mantra because we define morality as a human construct. Morality changes over time and is relative to conditions. We don’t anchor our views to morality, because anchored morality becomes a fixed state of perception; it reduces people and problems into binary categories: right or wrong, good or bad, yes or no, should or shouldn’t. Life is rarely, if ever, this way. We reject such fixed states and try our best to avoid them.

Fixed states of perception can produce rigid boundaries that constrain the novel interpretation critical thinking depends on. In this way, moralization supplants critical thinking with a thinking process based on rules that may not be useful. Moralizing problems is a convenient shortcut to an unsuitable outcome: a predetermined mindset formed without inquiry.

Acknowledging that our moral landscape affects our thinking can be uncomfortable. For many of us, it leads to a space of “uncertainty.” Moral judgements are efficient at getting to certainty, while amoral perception is more ambiguous. And, well, humans, especially in developed markets, love certainty. Certainty is synonymous with control. And, wow, do we all like control! Especially in business. Risk mitigation is a love language in business. But in innovation, and more specifically in problem solving, it’s a barrier.

You might be asking yourself, “So what, Kevin? Society needs morals and ways to control problems. Morality gives us a way to manage problems and mitigate harm.” 

Well, yes, they do. 

And they also lead to flaws in logic, because morality is a rule-based decision-making process, and rule-based decision-making processes are incomplete. No one I’ve met has the perfect information needed to know all things at all times about a presenting problem. Classical economics is incomplete in its description of rational players. Life is wild. Reality is messy. Thinking and cognition are flawed. But that doesn’t mean we shouldn’t think critically, regardless of incomplete information.

So, why am I pointing all this out? 

I am pointing this out because awareness of moral biases helps us be more useful critical thinkers, creative doers, problem solvers, citizens, friends, allies, and lovers. Healthy communities, societies, and relationships thrive when we understand that we’re biased. Every day of my life I face my own biases and those of my community and society. I watch myself and others disconnect or bond based on how we work with our biases. Fearlessly disclosing our biases and judgments can bring great healing. Hiding them, or masquerading them as righteous and correct, creates division. I have played out all aspects of division and connection as I’ve learned to heal my need to moralize things. Over time, I set about creating CNTRD as a counterweight. CNTRD is a safe space for us to explore our mental models, not just in client work but in our lives. We’re a community of deep thinkers and passionate lovers of life.

When we embrace our morality and then suspend it, holding it as a frame of reference while we explore things from an amoral perspective, we are not ignoring morality. We expand the possibility for what we can perceive and sense. It means we look at the implications of our own moral judgments. It means we see morality as a useful problem-solving tool, part of a larger toolbox, rather than the entire toolbox and the only tool in it.

For example, we have been asked, “How can you believe in regenerative design and work in big ag?” We work in big ag because we want to be part of the dialogue, offer alternative views, and increase our understanding of the conditions and problems. We want to learn and share information. Our morals can be one diagnostic tool we bring out. Then we set it aside and explore with others. Then we compare what we’ve discovered to see the patterns in the information we’ve explored. CNTRD recognizes that to effectively innovate on a problem set, we need useful information about the conditions in which it arises. But the process is sometimes messy and complicated.

For some team members the landscape is too much, and they opt out of projects or leave them early. We respect and encourage their opt-out. We all have the freedom to make choices based on our belief systems. Even when team members take a moral position that keeps them from engaging in a project, they still offer valuable information to it. Their moral position becomes one of many frames we use to diagnose and solve a problem.

We don’t moralize moralizing. Our desire is to solve wicked problems with awesome people. All views are welcome. We’ve learned that the best way to solve a problem is at the fringe. In Part Two of this topic, I’ll go deeper into the fringe.