Friday, August 29, 2014

Canoeing your way to meeting effectiveness

My Q&A with Dick and Emily Axelrod in s+b:

Dick and Emily Axelrod’s Method for Holding Better Meetings


The idea of making meeting attendance voluntary is borrowed from Eric Lindblad, a vice president at Boeing and general manager of its 747 program, who adopted it as a feedback mechanism. He figured that if people didn't show up for his meetings, the meetings needed to be either canceled or improved.

The Axelrods, a husband-and-wife team specializing in organizational development, wrote their new book, Let's Stop Meeting Like This (2014), for executives whose meetings fall into the category of needing improvement. They believe that far too many of the estimated 11 million meetings held daily in the United States are mind-numbing, energy-sapping encounters during which participants are more likely to be motivated to hide from work than to get it done.

If you suspect that people might not show up to your meetings if they had a choice, read the interview here...

What's your error culture like?

My new book post is up on s+b:

Risky Business

In 1934, Max Wertheimer, a pioneer of Gestalt psychology, decided to see if he could stump his pen pal Albert Einstein with a math problem. In a letter, he wrote:

“An old clattery auto is to drive a stretch of two miles, up and down a hill, /\. Because it is old, it cannot drive the first mile—the ascent—faster than with an average speed of 15 miles per hour. Question: How fast does it have to drive the second mile—on going down, it can, of course, go faster—in order to obtain an average speed (for the whole distance) of 30 miles an hour?”

Being a math wizard who can cipher at lightning speed, I immediately saw the answer: the downhill run would need to be driven at 45 miles per hour (mph). That's wrong, of course. Averaging 30 mph over the entire two-mile run allows only four minutes, and the car has already used all four of them traveling the first mile at 15 mph. By the time the car gets to the top of the hill, it's impossible to average 30 mph over the whole run, unless some of Einstein's time-warping ideas have been embedded in the car's design.
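If you want to check the arithmetic yourself, here is a quick back-of-the-envelope sketch in Python (my own illustration, not Gigerenzer's; the variable names are made up for clarity):

```python
# Wertheimer's hill puzzle: how much time is left for the downhill mile?

TOTAL_MILES = 2.0        # the whole run, up and down
TARGET_AVG_MPH = 30.0    # desired average speed for the whole run
UPHILL_MILES = 1.0
UPHILL_MPH = 15.0        # the old clattery auto's best climbing speed

time_budget_hr = TOTAL_MILES / TARGET_AVG_MPH   # 2/30 h = 4 minutes total
uphill_time_hr = UPHILL_MILES / UPHILL_MPH      # 1/15 h = 4 minutes uphill
time_left_hr = time_budget_hr - uphill_time_hr  # 0 minutes for the descent

print(f"Total time budget: {time_budget_hr * 60:.0f} minutes")
print(f"Spent on the climb: {uphill_time_hr * 60:.0f} minutes")
print(f"Left for the descent: {time_left_hr * 60:.0f} minutes")
# With zero time left, the required downhill speed is infinite;
# no finite speed, 45 mph included, gets the average up to 30 mph.
```

The tempting 45 mph answer comes from averaging the two speeds arithmetically, (15 + 45) / 2 = 30, but average speed is total distance divided by total time, and the climb has already consumed the entire time budget.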

But I feel a little better knowing that the problem initially stumped Einstein, too. As Gerd Gigerenzer, director of the Max Planck Institute for Human Development, tells the story in his new book, Risk Savvy: How to Make Good Decisions (Viking, 2014), “[Einstein] confessed to having fallen for this problem to his friend: ‘Not until calculating did I notice that there was no time left for the way down!’” Gigerenzer uses the anecdote to illustrate an undeniable reality: We all make mistakes, even bona fide geniuses.

Indeed, when it comes to decision making and risk, the real problem isn’t so much making mistakes, but rather the fear of making mistakes. “Risk aversion is closely tied to the anxiety of making errors,” Gigerenzer writes.

When that anxiety is embedded in an organization's culture, it promotes "defensive decision making"—decisions that seem to offer protection against negative consequences, but can result in suboptimal outcomes and greater risk exposure. A common example offered by Gigerenzer is hiring a large national vendor with a well-known name even though a smaller, local vendor would provide better prices and better service. Just because "nobody ever got fired for buying IBM" (as the old IT axiom went) doesn't necessarily mean that buying IBM was the best decision for the buyer.

I asked Gigerenzer how you can tell if your company has a “negative error culture” that’s spawning defensive decision making. “If the leadership in an organization pretends that errors will never occur; if it tries to hide mistakes when they do occur; or if it looks for someone to blame when they can’t hide mistakes, you can bet that you’ve found a negative error culture,” he replied.

Echoing what several other business book authors (including Tim Harford, Megan McArdle, and Ralph Heath) have told us in recent years, Gigerenzer recommends that companies, as well as individual professionals, reframe how they view errors. He points to the commercial aviation sector as a case in point. The large-scale tragedies that can result when mistakes are made in-flight have forced the industry, and its regulators, to thoroughly examine every error, using a rigorous and transparent process of analysis and response, often in full public view. Increasingly, the industry is also working proactively to identify potential errors and prevent them. This is a major reason why air travel is the safest form of transportation.

Not all industries require such an intense focus on mistakes. But every company can benefit from what Gigerenzer calls a "positive error culture." Such a culture doesn't set out to make mistakes, or even welcome them. But when errors do occur, they aren't swept under the rug. Instead, they're treated as valuable learning opportunities that help companies avoid repeating similar mistakes in the future.