Fundamentals of Creating Analytic Solutions

Analytics 101


In today’s data-driven landscape, organizations across industries are grappling with a common challenge: how to effectively leverage data to optimize business processes, drive profitability, and increase customer satisfaction. Regardless of whether you’re a logistics company, food manufacturer, or healthcare provider, the path to creating robust analytic solutions is remarkably similar.

This comprehensive guide delves into the foundational elements that must be mastered to build a truly impactful analytics capability. We begin by emphasizing the critical importance of gaining a holistic understanding of your core business processes and their interdependencies. Too often, organizations falter by optimizing processes in siloes, missing the intricate connections that can undermine their efforts.

At the heart of the guide is the concept of “Right Architecture” – a data model that connects processes at the ideal level of granularity to surface accurate, actionable insights. Constructing this architecture requires interdisciplinary collaboration between business analysts, data architects, and integration experts, leveraging techniques like whiteboarding sessions and CRUD (create, read, update, delete) matrices.

The guide also highlights frequently overlooked pitfalls, such as establishing metrics at the wrong aggregation levels, leading to lagging, non-predictive indicators. It advocates for a “sense and response” decision framework driven by predictive metrics that can anticipate future scenarios and enable proactive decision-making.

But this is just the tip of the iceberg. The article dives deeper into critical topics like data integration strategies, identifying leading indicators through regression analysis, and real-world examples of how companies across sectors have unlocked game-changing insights.

Analytics – Are the same problems nagging us all?

You are a medium-sized logistics company, and most of your package deliveries are to corporate customers. To increase your profitability you are trying to optimize your business processes as well as increase customer satisfaction. And you are willing to step back and take a holistic approach, leaving no stone unturned. We try to answer two key questions:

1) Where should you start?
2) What should you do?

And the answer, to your surprise, is not very different if you are a food product manufacturer, a medical claims processing company, or even a pharmaceutical company.

Decision Support from data can help different organizations in different ways. American Airlines built its electronic reservations system (SABRE), Otis Elevator built a predictive maintenance solution, and American Hospital Supply built an online ordering system.

Irrespective of the industry, from a Decision Support perspective you have three domains at your disposal – Business Processes, Data, and Applications. This paper attempts to lay the guardrails for an analytics design procedure that straddles these three domains. Well before analytics design kicks in, it is the Business Processes and Data that must be gotten right.
Get your process maze right

The first step is to get the lay of the land right. In the face of increasing commoditization of products and services, competing value chains – and that means business processes – will soon be the only differentiators of your enterprise. A holistic approach is needed: many of your processes are interdependent, and optimizing one while ignoring another is like working on one side of an equation while ignoring the other.

For example, as a logistics company your business processes might include setting up new customers, receiving customers' package delivery requests, dropping off and picking up customers' packages, sorting packages, tracking packages, and invoicing customers. A package lost during sorting could cause issues in the customer tracking process and package delivery, and subsequently delay payment from the customer. Because these processes are interdependent, a holistic view is critical to connect the dots.

What do most people often get wrong – it's the Data Model

The whiteboarding sessions by the Business Analyst, Data Architect, and Integration Architect need to be independent so that each serves its unique purpose. The following three artifacts must have the concurrence of the business:

  • Process Map
  • Conceptual Data Flow
  • Conceptual Data Model (for only the entities in the business process)

A CRUD matrix (Create, Read, Update or Delete) for the data elements should be owned by the data integration architect.
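
To make the artifact concrete, here is a minimal sketch of how such a CRUD matrix could be captured and printed in Python; the processes and entities are hypothetical, drawn from the logistics example above:

    # A minimal sketch of a CRUD matrix for a logistics company.
    # Process and entity names are hypothetical, for illustration only.
    crud_matrix = {
        "Set up new customer":       {"Customer": "C", "Package": "-", "Invoice": "-"},
        "Receive delivery request":  {"Customer": "R", "Package": "C", "Invoice": "-"},
        "Sort packages":             {"Customer": "-", "Package": "U", "Invoice": "-"},
        "Track packages":            {"Customer": "R", "Package": "R", "Invoice": "-"},
        "Invoice customer":          {"Customer": "R", "Package": "R", "Invoice": "C"},
    }

    # Print the matrix: one row per business process, one column per entity.
    entities = ["Customer", "Package", "Invoice"]
    print(f"{'Process':<28}" + "".join(f"{e:<10}" for e in entities))
    for process, ops in crud_matrix.items():
        print(f"{process:<28}" + "".join(f"{ops[e]:<10}" for e in entities))

Reading the matrix row by row shows which process creates each data element and which processes merely consume it – exactly the lineage the data integration architect needs to own.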


The Conceptual Data Model is the first step to holistically understanding the data that your critical business processes rely on. It is important for the following reasons:

  • The dimensional model is essentially a subset of this conceptual model; it is based on the data elements documented in the conceptual model.
  • The data integration processes use the rules documented via the conceptual model to standardize and cleanse data.

In a nutshell, the conceptual data model will help you understand your key data entities and how they relate to each other. Looking at this data independent of its physical layouts allows you to identify and document relationships that are not enforced at either the data or the application level, but are nevertheless critical to the functioning of the business.
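
To make this concrete, here is a minimal sketch, in Python, of writing down the key entities and relationships for the logistics example; the entity names, attributes, and cardinalities are illustrative assumptions rather than a prescribed model:

    # A minimal sketch of a conceptual data model for the logistics example.
    # Entity names, attributes, and cardinalities are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Entity:
        name: str
        attributes: list

    @dataclass
    class Relationship:
        source: str
        target: str
        cardinality: str  # e.g. "1:N"

    entities = [
        Entity("Customer", ["customer_id", "name", "billing_address"]),
        Entity("Package", ["package_id", "weight", "status"]),
        Entity("Invoice", ["invoice_id", "amount", "due_date"]),
    ]

    relationships = [
        # A customer can request many package deliveries.
        Relationship("Customer", "Package", "1:N"),
        # A customer receives many invoices; an invoice covers many packages.
        Relationship("Customer", "Invoice", "1:N"),
        Relationship("Invoice", "Package", "1:N"),
    ]

    for r in relationships:
        print(f"{r.source} --{r.cardinality}--> {r.target}")

Even a listing this small documents business-critical relationships (an invoice covers many packages) that no single application schema may enforce.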

Get to the Truth Architecture, prevent costs from spiraling out of control
If you are trying to analyze clickstream data and need to slice and dice hits by zip code, product, and date, then you can keep pre-aggregated data on the total number of clicks on a product, on a particular date, from a particular city. But if you want to know how long a prospect spent on a particular section of the website, you need data with different attributes at a different aggregation level.
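
A rough sketch of the two grains, using pandas on a small hypothetical clickstream table (the column names and figures are assumptions):

    # Two aggregation grains over the same hypothetical clickstream data.
    # Column names (zip_code, product, date, section, seconds_spent) are assumptions.
    import pandas as pd

    clicks = pd.DataFrame({
        "zip_code": ["60601", "60601", "94105", "94105"],
        "product":  ["A", "A", "B", "A"],
        "date":     ["2024-01-01", "2024-01-01", "2024-01-01", "2024-01-02"],
        "section":  ["pricing", "specs", "pricing", "pricing"],
        "seconds_spent": [30, 45, 12, 80],
    })

    # Grain 1: pre-aggregated click counts by product, date, and zip code.
    click_counts = (clicks.groupby(["product", "date", "zip_code"])
                          .size().rename("clicks").reset_index())

    # Grain 2: time spent per section needs finer-grained attributes,
    # which the pre-aggregated counts above can no longer answer.
    time_per_section = (clicks.groupby(["product", "section"])["seconds_spent"]
                              .sum().reset_index())

    print(click_counts)
    print(time_per_section)

Once the data is rolled up to the first grain, the second question is unanswerable – which is why the grain decision must precede the build.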

A proper design of the dimensional model is critical to the efficacy of your analytic model. To get the right performance and reduce infrastructure expenditure, the right grain of aggregation must be found. The efficacy of the model is established not by its flexibility – you may have to go to the minutest level – but by whether you are architected to have data at the level where action needs to be triggered.

A dimensional model provides a “star schema” in which facts (at the center of the star) contain the information that needs to be analyzed or aggregated, and dimensions provide the ability to slice, dice, and aggregate the information in the facts. Dimensional models are fundamentally meant to arrive at the right granularity or aggregation level, where the metrics are first of all actionable (defined to point to a problem). Facts contain the measurements required to create metrics.
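
The sketch below shows a minimal star schema in pandas – one fact table at the package-delivery grain joined to two dimensions; all table and column names are illustrative assumptions:

    # A minimal star schema: one fact table, two dimensions.
    # Table and column names are illustrative assumptions.
    import pandas as pd

    dim_customer = pd.DataFrame({
        "customer_key": [1, 2],
        "customer_name": ["Acme Corp", "Globex"],
        "region": ["Midwest", "West"],
    })
    dim_date = pd.DataFrame({
        "date_key": [20240101, 20240102],
        "month": ["2024-01", "2024-01"],
    })
    # Fact grain: one row per delivered package (the level where action is triggered).
    fact_delivery = pd.DataFrame({
        "customer_key": [1, 1, 2],
        "date_key": [20240101, 20240102, 20240101],
        "delivery_cost": [12.50, 9.75, 14.00],
        "on_time": [1, 0, 1],
    })

    # Slice the facts by a dimension attribute: on-time rate by region.
    report = (fact_delivery.merge(dim_customer, on="customer_key")
                           .groupby("region")["on_time"].mean())
    print(report)

The fact rows carry the measurements; the dimensions supply the attributes along which those measurements become an actionable metric.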

Costs of several analytic environments spiral out of control, and performance suffers, when proper due diligence is not given to understanding the right aggregation needs.

Thus the right architecture of dimensions is actually the Truth Architecture, the architecture that rightly connects the processes at the granularity that brings out the truth.

Leveraging data across organizations (internal as well as external), and striking the right levels of aggregation, forms the basis of a predictive model. The CRUD matrix and the data models of the applications hold the key to appropriately identifying the data sources. These would naturally include ERP and CRM data, machine data, trouble tickets, invoices, product demand, weather data, social network data, and so on.
Getting Data Integration right
Data integration is the process of taking data from various data sources (databases, files, tables, spreadsheets, web server logs, etc.), at various levels of granularity, at varied points in time, with varied data quality and varied definitions, and integrating it to prime it for analysis.
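
As a toy illustration, integrating two hypothetical sources that use different key names and units might look like the following; all names, units, and figures are assumptions:

    # Toy data integration: two sources with different conventions,
    # standardized into one table. All names and units are assumptions.
    import pandas as pd

    # Source 1: CRM export, weight in kilograms.
    crm = pd.DataFrame({"CustID": [1, 2], "PkgWeightKg": [2.0, 5.5]})

    # Source 2: web server log, weight in pounds, different key name.
    weblog = pd.DataFrame({"customer_id": [1, 3], "weight_lb": [4.4, 11.0]})

    # Standardize: common key name and a single unit (kg).
    crm_std = crm.rename(columns={"CustID": "customer_id",
                                  "PkgWeightKg": "weight_kg"})
    weblog_std = weblog.assign(weight_kg=weblog["weight_lb"] * 0.453592)[
        ["customer_id", "weight_kg"]]

    # Integrate into one table, primed for analysis.
    integrated = pd.concat([crm_std, weblog_std], ignore_index=True)
    print(integrated)

Even this two-source example shows where the effort goes: reconciling names, units, and definitions before any analysis can begin.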

These data integration processes form the backbone of your analytical solution. Typically they will take up 70-80% of your people and infrastructure resources, and hence have the most significant impact on your project costs, complexity, and timelines.

Data integration solutions can be real time, batch, or a real-time/batch combination. Contrary to popular belief, simplicity in data integration requires greater effort, more planning, and more collaboration.

It requires a design approach that evaluates multiple solutions for simplicity and picks the one that is easiest to develop and maintain. It requires an automated regression testing approach, with a constant set of test data, ensuring that changes don't break what was working before.
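
One way such an automated regression test could look, sketched in Python with a constant fixture and a hypothetical integration step:

    # A sketch of an automated regression test for a data integration step.
    # The transform and the expected output are hypothetical.
    import pandas as pd

    def standardize_weights(df: pd.DataFrame) -> pd.DataFrame:
        """The integration step under test: convert pounds to kilograms."""
        return df.assign(weight_kg=df["weight_lb"] * 0.453592).drop(columns="weight_lb")

    def test_standardize_weights():
        # Constant test data: the same input is replayed on every change.
        fixture = pd.DataFrame({"customer_id": [1], "weight_lb": [10.0]})
        expected = pd.DataFrame({"customer_id": [1], "weight_kg": [4.53592]})
        result = standardize_weights(fixture)
        # Fails loudly if a change breaks what was working before.
        pd.testing.assert_frame_equal(result, expected)

    test_standardize_weights()
    print("regression test passed")

Because the fixture never changes, any behavioral drift in the integration step surfaces immediately, before it can corrupt downstream metrics.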

Did you get your metrics right: is the business making the right decisions?
A CPG company found most of its Business Intelligence investments meaningless because key metrics like Weeks of Supply were lagging, not predictive. Predictive metrics hold the key to predictive decision support: if your metrics must help you orchestrate a sense-and-response system, then the indicators need to be predictive. The Truth Architecture must be leveraged to provide truly predictive metrics, such as a forward-looking Cost per Case or Weeks of Supply, and one must be able to obtain those metrics at the right aggregation level. When you examine the metrics already in use, you will often find them computed at the wrong aggregation level.
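
As an illustration, Weeks of Supply turns predictive when inventory is divided by forecast demand rather than trailing sales; the formula variants and figures below are assumptions, not the CPG company's actual definitions:

    # Lagging vs. predictive Weeks of Supply. Formula variants and numbers
    # are illustrative assumptions, not the CPG company's actual definitions.
    inventory_on_hand = 12_000        # units, current stock
    avg_weekly_sales_past = 1_500     # units/week, trailing average
    forecast_weekly_demand = 2_000    # units/week, from a demand forecast

    # Lagging: tells you where you *were*.
    wos_lagging = inventory_on_hand / avg_weekly_sales_past

    # Predictive: tells you where you are *headed*, enabling sense-and-response.
    wos_predictive = inventory_on_hand / forecast_weekly_demand

    print(f"lagging WoS:    {wos_lagging:.1f} weeks")
    print(f"predictive WoS: {wos_predictive:.1f} weeks")

With the forecast in the denominator, the same stock level signals a potential shortfall two weeks earlier, which is what allows the business to respond before the problem materializes.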

Identifying leading indicators, and at the right aggregation level, is a maturity that requires either the availability of historical information to perform basic linear regression, or years of experience to set empirical rules. Most organizations have reporting and architecture designs that confine them to lagging metrics.
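
A basic sketch of vetting a candidate leading indicator with linear regression on lagged history; the indicator, outcome, and numbers are hypothetical:

    # Test a candidate leading indicator: regress this week's outcome on
    # last week's indicator. Data and variable names are hypothetical.
    import numpy as np

    # Hypothetical history: website inquiries (indicator) and orders (outcome).
    inquiries = np.array([100, 120, 90, 150, 130, 170, 160, 180])
    orders    = np.array([ 48,  50, 61,  44,  76,  64,  86,  81])

    # Lag the indicator by one week: does week t-1's value predict week t?
    x = inquiries[:-1]
    y = orders[1:]

    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]

    print(f"orders_t = {slope:.2f} * inquiries_(t-1) + {intercept:.2f}")
    print(f"correlation r = {r:.2f}")  # a strong r suggests a leading indicator

A strong lagged correlation is only a starting point for validation, but it is exactly the kind of evidence that historical data, at the right grain, makes available.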

In a nutshell, the fundamental process of designing analytic solutions is not very different across industries. The analytic needs of various industries differ, but the metrics and the data architecture used are very similar. It requires a holistic view that takes into account the core interdependent business processes, and interdisciplinary whiteboard discussions to map out the processes, analytic needs, data model, and data sources. Aided by the appropriate data integration model, this can help you reap the benefits of the huge array of data that the information systems within your organization generate.
