
Data Analytics & Data Insights

  • Writer: Gee Virdi
  • Mar 12, 2022
  • 3 min read

An Old (But Great) Story That Perfectly Illustrates the Difference Between Data Analytics and Data Insights

During World War II, the Allies mapped the bullet holes in planes that had been hit by Nazi fire.

They planned to strengthen the planes by reinforcing the areas that had taken the most bullets (shown as red dots in the picture).

Theoretically, this was a logical deduction. After all, these were the most affected areas. I would add: this is exactly what an artificial intelligence, reasoning from correlation and statistics, would have done.

But Abraham Wald, a mathematician, came to a different conclusion: the red dots represented only the damage on the planes that were able to return home.

Accordingly, we should reinforce the areas where there are no points—because those are the places where a plane would not survive being hit.

This phenomenon is called survivorship bias. It happens when you focus on the things that survived, when you should be focusing on the things that didn’t.
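A toy simulation makes the bias concrete. The numbers below are hypothetical, chosen only to illustrate the statistics: hits land uniformly across the airframe, but a plane hit in the engine rarely makes it home, so the holes counted on returning planes cluster everywhere except the engine.

```python
import random

random.seed(42)

SECTIONS = ["fuselage", "wings", "tail", "engine"]
# Assumed survival rates per hit location (illustrative, not historical):
# a hit to the engine is usually fatal, so engine hits rarely
# appear in the sample of planes that return.
SURVIVAL_RATE = {"fuselage": 0.95, "wings": 0.95, "tail": 0.95, "engine": 0.30}

observed_hits = {s: 0 for s in SECTIONS}
returned = 0

for _ in range(10_000):  # each plane takes one hit, uniformly at random
    hit = random.choice(SECTIONS)
    if random.random() < SURVIVAL_RATE[hit]:  # the plane makes it home
        observed_hits[hit] += 1
        returned += 1

# Hits were uniform, yet the engine looks "safe" in the surviving data.
for section in SECTIONS:
    share = observed_hits[section] / returned
    print(f"{section:8s} {share:.1%} of holes on returning planes")
```

Counting only the survivors, the engine shows by far the fewest holes, which is precisely Wald's point: the missing holes mark the fatal spots.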

What are you looking at in the current crisis? Where are we taking bullets, and where should we act? Most importantly, are you using your amazing human brain and human intelligence to see things in ways that no artificial intelligence may be able to conceive in the near future?

We (humans) possess the unique ability to learn and apply our knowledge, combining critical thinking, holistic reasoning, creativity, and emotional intelligence. We must leverage what is specific to us, as humans, together with AI. In the end, our intelligence remains the most important one—when we use it.

Decomposition, the first cornerstone of algorithmic business thinking, is the idea of breaking complex problems into smaller parts. For example, a problem could be broken into four smaller problems, and each of those into four more, until you arrive at pieces small enough to start solving. This creates momentum and confidence, McDonagh-Smith said.
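The four-way split described above can be sketched in a few lines of code. The task here, summing a long list of numbers, is purely a stand-in: the point is the recursive structure, where each problem is broken into four sub-problems until the pieces are trivially solvable.

```python
def solve(numbers):
    # Base case: the problem is small enough to solve directly.
    if len(numbers) <= 4:
        return sum(numbers)
    # Otherwise break it into four smaller problems, as in the text,
    # and solve each one the same way.
    q = len(numbers) // 4
    parts = [numbers[:q], numbers[q:2 * q], numbers[2 * q:3 * q], numbers[3 * q:]]
    return sum(solve(part) for part in parts)

print(solve(list(range(1, 101))))  # prints 5050
```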

Pattern recognition, or recognising patterns of success and failure and being able to apply them in adjacent or different domains. For example, if work has been conducted successfully in one area, we could transplant that strategy into other areas to drive efficiency.

Abstraction. While many think abstraction means being vague, abstraction in algorithmic business thinking does the opposite—it removes the noise from the signal, McDonagh-Smith said. Amid so much data, being able to abstract and remove unnecessary elements for a given task is especially valuable, allowing people to focus on what’s important.

Algorithmic partnership between humans & machines. The first three cornerstones feed into the evolving relationship between humans and machines. “Algorithmic business thinking is humans and machines working together on problems.”

McDonagh-Smith’s double helix model features the digital world and physical world bound together by human traits, such as critical thinking, collaboration, and compassion. He highlighted three other key traits:

Creativity, which helps people make the most of artificial intelligence technologies. “Technology allows us to do great things, but we’ve got to figure out what that means for our business model innovation and what it means for actual work in our organisations, and creativity is key,” McDonagh-Smith said.

Curiosity, which helps companies find new directions and possibilities and disrupt the status quo; this often means being comfortable with the unfamiliar. “The more curious you are, or the more often you’re able to apply your curiosity, the broader the landscape you create to operate in,” McDonagh-Smith said. “It’s probably one of the best tools that we have to counter bias, as well.”

Consilience, which means unification. “I think we can unify physical and digital, human and machine capabilities,” he said. “I think we can unify the past with the future and the present.”


Data observability is an organisation's ability to fully understand the health of its data system. Data observability uses automated monitoring, alerting, and triaging to identify and evaluate data quality and discoverability issues. This leads to healthier pipelines, more productive teams, and happier customers. In the coming months and years, it won’t be enough to apply a reactive approach to data quality. Instead, the most successful data engineering teams will embed data observability into their daily operations.
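As a minimal sketch of what embedding observability into daily operations could look like, the snippet below runs a few automated health checks (freshness, volume, null rate) over a snapshot of table metrics. The `TableStats` schema, table name, and thresholds are all hypothetical, not a real tool's API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class TableStats:
    """A snapshot of health metrics for one table (hypothetical schema)."""
    name: str
    last_loaded: datetime
    row_count: int
    null_rate: float  # fraction of nulls in a key column

def check_health(stats: TableStats, expected_rows: int) -> list[str]:
    """Return a list of alert messages; an empty list means the table looks healthy."""
    alerts = []
    # Freshness: flag tables with no load in the last 24 hours.
    if datetime.now(timezone.utc) - stats.last_loaded > timedelta(hours=24):
        alerts.append(f"{stats.name}: data is stale (no load in 24h)")
    # Volume: flag tables whose row count dropped below half of expected.
    if stats.row_count < 0.5 * expected_rows:
        alerts.append(f"{stats.name}: row count anomaly ({stats.row_count} rows)")
    # Quality: flag a key column with more than 5% nulls.
    if stats.null_rate > 0.05:
        alerts.append(f"{stats.name}: null rate {stats.null_rate:.1%} exceeds threshold")
    return alerts

stats = TableStats("orders",
                   last_loaded=datetime.now(timezone.utc) - timedelta(hours=30),
                   row_count=4_000, null_rate=0.12)
for alert in check_health(stats, expected_rows=10_000):
    print(alert)
```

In practice these checks would run automatically on every pipeline run and feed an alerting and triage workflow, rather than waiting for a downstream consumer to notice bad data.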
