Everything in the digital economy revolves around data. Businesses are becoming increasingly smart, collecting and analyzing data in a structured way so they can make high-quality, fact-based decisions quickly. The increasing importance and use of data has led to new developments that will have a big impact on Business Intelligence. In practical terms, the Passionned Group predicts the following six Business Intelligence trends for 2015.
1. Digital Ethics & Privacy
Personal data is sacred in Europe, and we expect other continents and countries to follow. The Data Protection Act means the privacy of citizens’ data must be guaranteed. The implementing bodies, the Data Protection Authorities, define ‘personal data’ as any data that can be directly traced back to a natural person. This also applies to encrypted or aggregated data.
The new digital economy demands a new ethic for Business Intelligence and Analytics. How can data be obtained? What is data used for? A number of large data owners are known to have hardly any reservations at all when it comes to the commercialization of data. That cannot be allowed to continue.
For a long time, the Authorities have been something of a paper tiger that was limited to making recommendations and initiating investigations after the damage had been done. Recently however, some Authorities have been given the power to impose fines.
As consumer researchers have discovered, there is a general trend for the younger generation to be less concerned about privacy, and exchange parts of their privacy for discounts and attractive offers. Privacy is becoming, as it were, a new currency.
One concrete step in the field of privacy protection is the phenomenon of data masking, a new capability in the professional ETL toolkit that makes it possible to analyze anonymized data. If ‘suspicious’ patterns occur, an identity can only be revealed after formal authorization.
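A minimal sketch of what such masking might look like in practice. This is not a specific vendor's ETL feature: the key name, field names, and records are all illustrative assumptions. Identifiers are replaced by keyed pseudonyms before analysis, and only the holder of the secret key (the formal-authorization step) could link a pseudonym back to a person.

```python
import hashlib
import hmac

# Assumption: the key is held outside the analytics platform, e.g. by a
# data protection officer, so analysts cannot reverse the pseudonyms.
SECRET_KEY = b"held-by-the-data-protection-officer"

def mask(identifier: str) -> str:
    """Return a stable pseudonym for a personal identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# Illustrative records; analysts see only the masked version.
records = [
    {"customer": "alice@example.com", "purchase": 120.0},
    {"customer": "bob@example.com",   "purchase":  35.5},
    {"customer": "alice@example.com", "purchase": 410.0},
]
masked = [{"customer": mask(r["customer"]), "purchase": r["purchase"]}
          for r in records]
```

Because the same person always maps to the same pseudonym, suspicious patterns stay detectable in the anonymized data, which is exactly what makes the approach useful for analytics.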
2. Internet of Things
Increasingly, more measuring instruments, cameras, and other methods of recording are located in public spaces, at home, or at specific business locations. More and more new products and services are based on the data generated by devices (sensors, meters, cameras). Think of smart meters, pumps with sensors three kilometers below sea level, and even diapers that generate data. These devices all send their data to a platform fitted with analytical software: for some of them, that will involve a few kilobytes once a day, while for others, like aircraft engines, it will involve a real-time stream exceeding terabytes per hour.
All of this (real-time) data is tested, qualified, and used to make decisions via a qualified model. There are thousands of examples of such applications in almost every sector, and the number is growing by the day. This results in an enormous flow of data that needs to be moved and analyzed. In addition, use of the internet and mobile (4G) telephone networks is increasing too.
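The "test, qualify, decide" step above can be sketched in a few lines. The device name, the readings, and the threshold rule are all hypothetical; a real platform would use a properly validated model rather than this simple running-average check.

```python
from statistics import mean

# Illustrative stream from a single (hypothetical) smart meter.
readings = [
    {"device": "smart-meter-7", "kwh": 1.2},
    {"device": "smart-meter-7", "kwh": 1.4},
    {"device": "smart-meter-7", "kwh": 9.8},  # an outlier worth flagging
]

def qualify(stream, threshold_factor=2.0):
    """Flag readings far above the stream's average.

    A stand-in for the qualified model the platform would really apply.
    """
    avg = mean(r["kwh"] for r in stream)
    return [r for r in stream if r["kwh"] > threshold_factor * avg]

alerts = qualify(readings)
```

The point of the sketch is the pipeline shape: raw device data arrives, a model qualifies it, and only the qualified results drive a decision.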
3. Data Storytelling
Information only begins to come to life if a story is told with it. We’re all familiar with the development of “Human Interest” stories in journalism: news and facts are hung on human experiences and emotions.
Some people refer to data storytelling as the next step in the development of data visualization. Understandably so, because on the basis of visualized data, the user gains more and more insight into the story behind the data. Some software vendors already supply the capability to apply a story line to a data visualization. It’s an interesting development.
It isn’t yet completely clear where storytelling belongs on the menu alongside discovery, exploration, visualization, and data presentation. It’s also debatable whether existing Business Intelligence tools are able to improve communication about data. And isn’t it actually more about story-finding? This may appear to be a semantic issue, but that’s far from the case. ‘Storytelling’ sounds fine, but it’s about discovering the story behind the data, not so much about recounting it.
4. Shortage of data scientists
Bringing different kinds and types of data together creatively, obtaining previously unimagined insights, and translating them into practical applications remains the work of human beings. The exponential growth and availability of data, coupled with the ever-growing need for analytics, ensures that the existing shortage of data scientists will only grow. A good data scientist understands statistics, data blending, and data visualization.
5. Data lakes and ecosystems
An enormous amount of data is becoming available, if only by virtue of the rise of the Internet of Things and open data. More and more businesses are storing data in its original form, in large but accessible and inexpensive storage pools (Hadoop, NoSQL). Because the data’s destination isn’t yet known, the data is stored with all its attributes, in contrast to data marts, for example. Organizations can therefore go and fish in the data lake whenever they want to.
Over time, this method of data storage becomes its own ecosystem: the data lake heats up, as it were, data evaporates by deletion, ‘clouds’ arise when data sets are separated, new data comes in as ‘rain showers’, and there are several ‘rivers’ that bring a steady stream of data to the lake.
You need to be careful that the lake doesn’t turn into a swamp because of the lack of uniform definitions and performance problems. For this reason, smart data management is also a necessity.
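The "store everything, fish later" idea is often called schema-on-read, and a toy sketch makes it concrete. The list standing in for a storage pool, the record contents, and the field names are all illustrative assumptions, not a real Hadoop or NoSQL API.

```python
import json

lake = []  # stands in for an inexpensive storage pool such as HDFS

def ingest(raw: str) -> None:
    """Store the record as-is, with all its attributes; no destination yet."""
    lake.append(raw)

def fish(field: str) -> list:
    """Apply a schema at read time: pull one attribute from every record that has it."""
    out = []
    for raw in lake:
        record = json.loads(raw)
        if field in record:
            out.append(record[field])
    return out

# Heterogeneous records land in the lake in their original form.
ingest('{"sensor": "pump-42", "depth_m": 3000, "pressure": 312.5}')
ingest('{"customer": "c-17", "purchase": 99.0}')

pressures = fish("pressure")
```

Note the contrast with a data mart: nothing is modeled upfront, so any attribute remains fishable later, which is also why the uniform definitions mentioned above have to be managed separately.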
6. Big Data bites off two letters from ETL
Following on from data lakes, the familiar Extract-Transform-Load (ETL) process for loading data warehouses and enabling BI is used in fewer and fewer cases. Big Data is simply too big to be loaded (L) and transformed (T). Increasingly, data streaming is introduced instead, but often only once it is clear which report or analysis is being requested at that time.
The analytical platform (for example, Hadoop in conjunction with the BI solution) transforms the various data sets on the fly into a single data model. To get a decent response time, as much as possible is performed in-memory. So Big Data actually bites two letters off ETL, the T and the L, which immediately raises the question: ‘Is there still a future for ETL?’
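The on-the-fly pattern described above can be sketched as follows. The event records and the report function are hypothetical: the point is that data is loaded raw and the transform runs in memory only when a specific report is requested, rather than upfront as in classic ETL.

```python
# Raw events loaded as-is: strings, no upfront parsing or modeling.
raw_events = [
    {"ts": "2015-01-03", "region": "EU", "amount": "120.50"},
    {"ts": "2015-01-03", "region": "US", "amount": "80.00"},
    {"ts": "2015-01-04", "region": "EU", "amount": "15.25"},
]

def report_revenue_by_region(events):
    """Transform on demand: parse and aggregate only for this one report."""
    totals = {}
    for e in events:
        totals[e["region"]] = totals.get(e["region"], 0.0) + float(e["amount"])
    return totals

# The transform happens in memory, triggered by the report request.
totals = report_revenue_by_region(raw_events)
```

A different report would apply a different transform to the same raw data, which is why the T and L of classic ETL disappear from the loading stage: they move to query time.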
Take a look at our completely updated ETL Tools & Data Integration Survey.