
Developers, Defensive Culture & Psychological Safety

When was the last time you didn't know something?

Tell me about the last time you made a mistake.

I believe these are fundamentally important questions to ask a future manager. The specific answers don't really matter - what matters is whether they have the humility to accept their own humanity.

In this blog post, I'd like to explain why this is of particular importance.

Research on Psychological Safety

It isn't just my own experience that backs this: the importance of psychological safety is supported by empirical research too.

In Project Aristotle, Google studied what made teams highly effective, using double-blind interviews and survey data. The researchers studied a total of 180 teams, including 115 project teams in engineering. According to Google: "Psychological safety was far and away the most important of the five dynamics we found".

Yes, Google is a unique workplace - but these results have been validated across the industry.

By 2019, the State of DevOps reports drew on 31,000 data points, with a further 1,000 engineers adding their voices that year. The 2019 State of DevOps report "found that this culture of psychological safety is predictive of software delivery performance, organizational performance, and productivity".

Whether your goals are software delivery and operational (SDO) performance, organisational performance, or productivity, psychological safety is essential to all of them.

Defensive Culture

The opposite of a psychologically safe culture is a defensive culture.

In companies with a defensive culture, the focus is on individuals avoiding calculated risks, for fear of being blamed for a mistake.

This is not a "safety" culture; indeed, defensive cultures often work against producing safe systems. For example, in a defensive culture there can be vexatious levels of testing before software is deployed (long-running E2E test suites for very low-risk functionality), but no monitoring once the software is running in production, as the sketch below illustrates.
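
To make the contrast concrete, here is a minimal sketch of the kind of lightweight production monitoring that a defensive culture tends to skip in favour of ever-longer pre-deploy test suites: a simple post-deployment health check that raises the alarm after repeated failures. The endpoint URL, check interval, and failure threshold are hypothetical placeholders, not a prescription.

```python
import sys
import time
import urllib.request

# Hypothetical values for illustration only; real systems would load
# these from configuration and feed alerts into a paging system.
HEALTH_URL = "https://example.com/healthz"
CHECK_INTERVAL_SECONDS = 30
FAILURE_THRESHOLD = 3


def check_health(url: str) -> bool:
    """Return True if the service responds with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            return response.status == 200
    except OSError:
        # Covers connection errors and HTTP errors (URLError subclasses OSError).
        return False


def main() -> None:
    consecutive_failures = 0
    while True:
        if check_health(HEALTH_URL):
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            print(f"Health check failed ({consecutive_failures}x)", file=sys.stderr)
        if consecutive_failures >= FAILURE_THRESHOLD:
            # In a real system this would page someone or open an incident,
            # rather than simply exiting.
            sys.exit("Service unhealthy: alerting and stopping monitor.")
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    main()
```

Even something this simple shifts risk management from gating deployment towards observing what the software actually does in production - which is where real safety comes from.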

Deployment of software is held back on the basis of fear, without considering the balance of risk and reward. There is a reluctance to test new ideas and product functionality out of fear of being blamed.

The culture does not reward success achieved through taking calculated risks; instead, individuals plateau out of fear of failure.

Corporate pay scales are usually not oriented towards rewarding successful leaders, and in the worst cases, harassment and abuse are directed at those who flag issues or those seen as responsible for mistakes.

Staff members (often including senior engineers and managers) are reluctant to admit the limits of their own knowledge or their own mistakes.

Feedback is often something that is given, but not taken. In other words, feedback is a tool to put subordinate employees "in their place" rather than a tool for blameless learning.

Developing Psychological Safety

In organisations with strong psychological safety, people do not fear personal consequences for innocent mistakes and are encouraged to take calculated risks. Leadership understands that professional engineering discipline requires strong communication, an “open reporting” approach, and the avoidance of a “good news only” or closed culture. Professional engineering bodies (e.g. the Engineering Council UK) reiterate this in their professional guidance on risk.

In my own career as an engineering manager, I have found it important to share my own mistakes, both so that my staff challenge my work when it is wrong and to comfort junior engineers when they make their first small mistakes.

Fundamental to developing psychological safety in an engineering team is that the leadership admits its own fallibility and is open to feedback on its own work.
