Applying predictive analytics to real-time data feeds enables a business not only to make better decisions faster, but also to capitalise on opportunities as they materialise, react more quickly to emerging threats, and mitigate risks across the broader business. However, implementing real-time advanced analytics requires a different approach and a different toolset.
In my previous posts on real-time BI, I covered the actions required to achieve the real-time value of Business Intelligence (BI), as highlighted in a recent white paper by the Aberdeen Group¹. Those posts addressed how achieving real-time BI is possible, how to train your analytical team, and the importance of quality data in getting real-time BI right. This post focuses on combining real-time data with predictive modelling applications, the final piece in the puzzle of how real-time business benefit can be realised.
By implementing real-time predictive modelling applications, the business can mitigate risks "on the fly" and avert problems before they escalate. This helps decision-makers make better decisions in much shorter timeframes, taking their businesses from being reactive (which is no longer good enough) to being proactive. Being proactive results in a more stable business model, which is ultimately what the business strives for.
Real-time analytics is a modelling approach in which models such as logistic regressions or neural networks are run in a massively parallel, in-memory environment. By doing this, samples can be larger, or sampling can be eliminated entirely so that models run on the full population. Similarly, the number of variables tested can be expanded and the number of iterations increased dramatically.
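To make this concrete, here is a minimal sketch of fitting a logistic regression over a full in-memory population with batch gradient descent, rather than a sample. The single-feature model and synthetic data are purely illustrative stand-ins, not any particular vendor's toolset:

```python
import math
import random

random.seed(42)

# Synthetic "full population": no sampling, every record stays in memory.
# The single feature x stands in for a real customer metric.
population = [(random.gauss(0.0, 1.0),) for _ in range(5_000)]
labels = [1 if x + random.gauss(0.0, 0.5) > 0 else 0 for (x,) in population]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(data, y, iterations=150, lr=0.1):
    """Batch gradient descent over the entire in-memory dataset."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(iterations):          # more iterations = finer tuning
        gw = gb = 0.0
        for (x,), yi in zip(data, y):
            err = sigmoid(w * x + b) - yi
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

w, b = train_logistic(population, labels)
accuracy = sum(
    (sigmoid(w * x + b) >= 0.5) == bool(yi)
    for (x,), yi in zip(population, labels)
) / len(population)
```

In a massively parallel environment the same gradient computation would be split across nodes and summed, which is what makes full-population training with many iterations feasible in short timeframes.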
This is useful for the following situations:
- High Volume & Speed: It is necessary to run many models quickly.
- High Width & Depth: It is desired to test hundreds or thousands of metrics across tens of millions of customers (or other entities).
- High Complexity: It is critical to run processing-intensive algorithms on all of this data and to allow for many iterations to occur.
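The "high width and depth" case above can be sketched as fitting many small candidate models concurrently, one per metric, and keeping the best. This toy example uses Python threads on a synthetic customer table; a production system would spread the work across a massively parallel in-memory cluster instead:

```python
import random
from concurrent.futures import ThreadPoolExecutor

random.seed(0)

N_CUSTOMERS = 2_000
N_METRICS = 50   # stands in for the hundreds or thousands of metrics

# Wide in-memory table: one row per customer, one column per metric.
table = [[random.gauss(0.0, 1.0) for _ in range(N_METRICS)]
         for _ in range(N_CUSTOMERS)]
# The outcome is secretly driven by metric 7, so the search should surface it.
outcome = [1 if row[7] > 0 else 0 for row in table]

def fit_one(metric_idx):
    """Fit one tiny model: accuracy of thresholding a single metric at 0."""
    hits = sum((row[metric_idx] > 0) == bool(y)
               for row, y in zip(table, outcome))
    return metric_idx, hits / N_CUSTOMERS

# One candidate model per metric, all evaluated concurrently.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(fit_one, range(N_METRICS)))

best_metric, best_acc = max(results, key=lambda r: r[1])
```

The pattern, not the thresholding "model", is the point: when each candidate fit is independent, testing thousands of metrics is an embarrassingly parallel workload.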
More accurate models are generated in this manner, since many more tuning iterations can be completed in much shorter timeframes. Models can be updated regularly to keep the scoring routines fresh, rather than relying on stale existing routines. A much higher volume of models can be generated in the same timeframe, which allows models to be applied more broadly to new problems. Models can be built even on large and complex data in seconds or minutes, which enables true real-time model evaluation and application.
The approach and implementation outlined here enable organisations to benefit from a real-time analytical environment that accelerates business visibility and the speed of decision-making. Taking into account all the aspects I have previously discussed, the real-time value of BI has not only become an enabler of business success; by applying real-time predictive analytics, it has also become a critical success factor for dynamic businesses.
¹ Business Answers at Your Fingertips: The Real-Time Value of BI, Aberdeen Group, February 2011