Cortana, Siri, Alexa: we encounter modern data analytics methods in our daily lives – on the PC, on the phone, or through the stereo system – that differ greatly from traditional reporting. The term data analytics covers a great diversity of methods for processing data to answer crucial questions. This article provides an overview of the key methods, organized by their time perspective and possible applications.
Traditional BI – business intelligence – helps us evaluate company figures and thereby conduct a mostly retrospective analysis of business performance. Attributes and figures from the company's leading systems are compiled and transformed to a comparable basis. KPIs – key performance indicators – help us understand the relationships between figures and attributes over time. Traditional BI projects draw data from one or more business systems such as Microsoft Dynamics AX or SAP and prepare them daily for analysis. Aside from business management expertise, validating the data in the upstream systems is of particular importance here. That is why traditional BI projects in the course of an ERP implementation only gather speed shortly before the end of the project. The processed data are compiled in a data warehouse, and the result is delivered in the form of multidimensional databases for OLAP (online analytical processing). We call this analysis method "managed BI".
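The nightly transform step can be sketched in a few lines. This is a minimal illustration, not a real warehouse load: the revenue records and the month-over-month growth KPI below are invented for the example.

```python
from collections import defaultdict

# Hypothetical daily revenue records extracted from an ERP system
records = [
    {"month": "2024-01", "amount": 120_000.0},
    {"month": "2024-01", "amount": 80_000.0},
    {"month": "2024-02", "amount": 230_000.0},
]

# Transform: aggregate the raw figures to a comparable monthly basis
revenue_by_month = defaultdict(float)
for r in records:
    revenue_by_month[r["month"]] += r["amount"]

# KPI: month-over-month revenue growth
months = sorted(revenue_by_month)
for prev, curr in zip(months, months[1:]):
    growth = revenue_by_month[curr] / revenue_by_month[prev] - 1
    print(f"{curr}: {growth:+.1%}")
```

In a real managed BI project this aggregation would run in the data warehouse load, and the KPI would be defined once and served from the OLAP model rather than recomputed ad hoc.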
"Traditional BI projects in the course of an ERP implementation only really gather speed shortly before the end of the project!"
Managed BI remains a cornerstone of successful business management even though technological progress allows us to do far more. With new technologies such as in-memory processing or column-based indexing, queries against the relational data model itself can be accelerated to the point where the data can be used directly for analytical reporting. Without these technologies, with large data volumes the time between clicking in the report or an embedded filter and the display of the result could comfortably be spent on other day-to-day business – or a long lunch. Yet the report user's expectations for analytical visualization are entirely different, regardless of the data source: the results should be available instantly, or at least very quickly. We can assume an average of five seconds to display the result; a longer wait already makes many users impatient.
"Five seconds to display the result is the goal – longer wait times make many users anxious."
The ability to use relational data for ad-hoc analyses has also driven another, ever-growing reporting demand: flexible, individual access to the steadily increasing diversity of data and the numerous data sources found in every company – known in the analytics field as self-service BI.
Extreme flexibility in data connections is a defining characteristic of self-service BI. From Excel and simple text files to relational or multidimensional databases, data services, or even unstructured data such as social media posts – self-service BI demands access to everything, without long wait times. Bringing data into the right context adds value for the company that goes beyond the traditional approach. We call self-service analyses on operational data from a variety of data sources "operational analytics". With both managed BI and operational analytics, we look at data retrospectively. Data that are current to the day are generally adequate for managed BI projects, while operational analytics needs near-real-time data.
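"Bringing data into the right context" can be illustrated with two invented sources: a flat text export and master data as it might come from a relational business system. The file format, customer names, and regions are assumptions for the sketch.

```python
import csv
import io

# Hypothetical source 1: a flat CSV export, e.g. from a web shop
orders_csv = io.StringIO("order_id,customer,total\n1,Contoso,250\n2,Fabrikam,470\n")
orders = list(csv.DictReader(orders_csv))

# Hypothetical source 2: customer master data from a relational system
customers = {"Contoso": {"region": "West"}, "Fabrikam": {"region": "East"}}

# Bring both sources into one context: revenue per sales region
revenue_by_region: dict[str, float] = {}
for order in orders:
    region = customers[order["customer"]]["region"]
    revenue_by_region[region] = revenue_by_region.get(region, 0.0) + float(order["total"])

print(revenue_by_region)
```

Self-service BI tools do exactly this join-and-aggregate step behind a graphical interface, so the analyst never writes the loop by hand.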
Meanwhile, every company holds data that deserve special attention at the very moment they are created. This is the field of real-time data, generated for example by sensors of many different types. Applications range from heating systems reporting the current temperatures of their water circuits to camera images that, extended with the necessary intelligence, can warn us of possible dangers. Beyond warnings, defined thresholds in a real-time monitor also allow us to launch automated processes based on the data. The "Internet of Things" (IoT) communicates 24 hours a day, 365 days a year.
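A threshold rule in a real-time monitor can be reduced to a callback over the incoming stream. The sensor values and the 40 °C limit below are invented; a production system would publish the alert to a message queue rather than collect it in a list.

```python
# Assumed minimum acceptable water temperature for a heating circuit
THRESHOLD_C = 40.0

alerts: list[str] = []

def on_reading(temperature_c: float) -> None:
    """Called for every incoming sensor value; fires an automated action
    when the defined threshold is crossed."""
    if temperature_c < THRESHOLD_C:
        alerts.append(f"ALERT: circuit at {temperature_c:.1f} C, scheduling maintenance")

# Simulated stream of readings arriving one by one
for reading in [55.2, 48.7, 39.5, 37.1]:
    on_reading(reading)

print(alerts)
```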
"Data can be found in every company that should receive special attention even at the time they are created!"
We call the analysis of real-time data "stream analytics". Over longer periods of time, sensors supply us with never before seen data volumes in structured but also unstructured form that we can only handle with the right big data technologies. The distribution of data, computing power, and summarization of partial results into an overall result are crucial factors here. Now that we have reached the point of real-time analytics, what does the future hold?
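The "partial results into an overall result" idea can be sketched without any big data framework: each node aggregates its own partition, and only the small partial results travel to the coordinator. The partitioned sensor readings are invented for the example.

```python
# Hypothetical sensor readings split across three worker nodes
partitions = [
    [21.0, 22.5, 19.8],
    [20.1, 23.4],
    [18.9, 21.7, 22.0, 20.5],
]

def partial(readings: list[float]) -> tuple[float, int]:
    # Each node ships only (sum, count) - far smaller than the raw data
    return sum(readings), len(readings)

# Coordinator merges the partial results into the overall mean
partials = [partial(p) for p in partitions]
total, count = (sum(values) for values in zip(*partials))
print(f"overall mean: {total / count:.2f}")
```

This is the same pattern frameworks such as Hadoop or Spark apply at scale: the aggregation must be decomposable into mergeable partial results for the distribution to pay off.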
From operational data in business systems, Internet services, and social networks to sensor data from Internet of Things applications: genuine added value arises when we not only use the data for retrospective or current analysis but also learn from them in order to predict the most probable future. This is where machine learning technologies come in. Models trained on collected data can be used for projections or to classify current data. A familiar example from technical field service is proactive maintenance.
"The BI future has already begun with predictive analytics"
In the field of predictive analytics, we are looking to the future. But machine learning also enables us to take a further step. Collected data describe situations from the past; other, correlating and subsequent data describe how we handled each situation and what results our reactions achieved. The measured degree of success thus allows conclusions about the probability that a given reaction will lead to the greatest success. We call this analysis method "prescriptive analytics".
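Stripped to its core, the prescriptive idea is: score each past reaction by its measured success and recommend the one most likely to succeed. The maintenance actions and outcomes below are invented for illustration; real systems would condition on the situation as well.

```python
from collections import defaultdict

# Hypothetical history: (reaction taken, was it successful?)
history = [
    ("replace_pump", True), ("replace_pump", True), ("replace_pump", False),
    ("flush_circuit", True), ("flush_circuit", False), ("flush_circuit", False),
]

# Group outcomes by reaction
outcomes: dict[str, list[bool]] = defaultdict(list)
for action, success in history:
    outcomes[action].append(success)

# Measured degree of success per reaction
success_rate = {action: sum(results) / len(results) for action, results in outcomes.items()}

# Prescription: the reaction with the highest success probability
recommended = max(success_rate, key=success_rate.get)
print(recommended, round(success_rate[recommended], 2))
```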
Let us examine the aforementioned prospective analysis methods in simplified form using a heating system as an example:
"The sensors in a heating system supply temperatures for water heating. When we identify a falling temperature over a predefined period of time, the past data collected in the temperature drop over time allow us to assume a rising probability for a heating system failure in the near future. Predictive analysis allows us to act proactively, we have to do maintenance work on the heating system. The maintenance tasks are then carried out and their results are evaluated, and the corresponding data are collected. In the future, the prescriptive analysis method for these collected data will identify the most successful proactive maintenance tasks, and then we can even automate these."
The analysis methods and technologies in use contribute greatly to preparing data generated in various forms and at different times, purposefully and according to functional requirements. But the data then need to be visualized so the recipient can comprehend them. Various studies and surveys in recent years have indicated that, given the growing diversity of data, uniform visualization plays an ever-increasing role for end users. This is where Microsoft offers what is currently the most agile data visualization tool: Power BI. It visualizes data consistently regardless of the data source. Numerous data connectors are available to the user, and a heterogeneous data landscape can be displayed in a uniform visualization, processed, and made available to other end users.
"Power Bi is the most agile tool for uniform data visualization"
Power BI dashboards combine data from many different business areas in a company, standardizing the view of traditional analyses, self-service BI with access to operational data, real-time monitoring, and prospective analyses in one interface. Thus, they provide extensive insight for comprehensive business management.
The data that are already available to us today, summarized and correlated on one platform, require more far-reaching processing and visualization methods than the traditional business intelligence approach alone can deliver. In contrast to traditional business intelligence, the question of when we have to deal with data in a company – and with which specific analysis options – now arises at the very beginning of the digital transformation of business processes.
Thus, business intelligence integrates seamlessly with intelligent systems, providing comprehensive and integrated process support: BI becomes continuous data analytics. The various methods used in the field of data analytics overall – managed BI, operational analytics, stream analytics, predictive and prescriptive analytics – with their respective characteristics help us comprehend the various perspectives of time and problem space more quickly, identify their structures, and thus make the best possible decisions.