In recent weeks, we’ve released a number of videos demonstrating Timeflow, our software-as-a-service platform for real-time event stream processing.
The first two videos show how we can use Timeflow to capture real-time business intelligence and operational metrics, based on streaming event data from multiple systems.
In these videos we use small volumes of data and basic counters on simple business scenarios, but the same approach scales up to more complex statistical analysis over high-volume data, which we will demonstrate in future videos.
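As a rough illustration of the kind of basic counter these videos rely on, here is a minimal sliding-window event counter in Python. The event timestamps and the 60-second window are our own assumptions for the sketch, not Timeflow internals:

```python
from collections import deque

class SlidingWindowCounter:
    """Counts events seen within the last `window_seconds`."""

    def __init__(self, window_seconds: float):
        self.window_seconds = window_seconds
        self._timestamps = deque()

    def record(self, event_time: float) -> None:
        """Record one event at the given epoch timestamp."""
        self._timestamps.append(event_time)
        self._evict(event_time)

    def count(self, now: float) -> int:
        """Number of events inside the window ending at `now`."""
        self._evict(now)
        return len(self._timestamps)

    def _evict(self, now: float) -> None:
        # Drop events that have aged out of the window.
        while self._timestamps and self._timestamps[0] < now - self.window_seconds:
            self._timestamps.popleft()

# Example: count hypothetical order events in a 60-second window.
counter = SlidingWindowCounter(window_seconds=60)
for t in (0, 10, 30, 65):
    counter.record(t)
print(counter.count(now=70))  # events at 10, 30, 65 remain -> 3
```

A real stream processor would maintain such windows per key (per customer, per product, and so on) and at much higher volume, but the windowing idea is the same.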
This type of work is feasible with traditional business intelligence stacks such as Power BI on SQL Server, but most companies still suffer from slow time to insight, clunky ETL-style business intelligence platforms, and a focus on reporting on the past rather than operational, forward-looking reporting.
Traditional business intelligence systems also struggle to scale with high-volume, event- and time-oriented data, so there is real value in using Timeflow for this type of “real-time business intelligence”.
Things get more interesting in the next video, where we detect situations of interest in real time based on data streaming from multiple systems, and then, instead of simply dashboarding the metric, we are informed of the situation in real time via an API.
Though we can send emails or make API calls directly from within Timeflow, we also show how an external process can be informed of the situation via a push API based on Kafka, which opens up many possibilities for automation. For instance, in the video below, we show that we have detected high-value customers who are complaining frequently, a situation we could respond to by automatically scheduling a call in our call center to aid resolution.
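To make the push-API idea concrete, here is a sketch of what a downstream consumer of such a situation event might do. The event schema, the situation name, and the `schedule_call` action are all hypothetical, chosen only to mirror the complaining-customer scenario; a real integration would read these messages from the Kafka topic the push API publishes to:

```python
import json

def handle_situation(raw_event: str):
    """Decide on an automated action for a detected situation.

    The event schema here is assumed for illustration; a real
    consumer would receive these messages from a Kafka topic.
    """
    event = json.loads(raw_event)
    if event.get("situation") == "high_value_customer_complaints":
        # Automate the response: queue a call-back in the call center.
        return {
            "action": "schedule_call",
            "customer_id": event["customer_id"],
            "priority": "high",
        }
    return None  # no automation rule for this situation type

# Simulated message as it might arrive from the push API.
message = json.dumps({
    "situation": "high_value_customer_complaints",
    "customer_id": "C-1042",
    "complaints_last_30_days": 4,
})
print(handle_situation(message))
```

Because the detection logic lives upstream in the stream processor, the consumer stays simple: it only maps already-detected situations onto business actions.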
We are excited by the potential here and would welcome conversations with people who have an interest in this field or knowledge of more traditional business intelligence and analytics.