Timeflow is a “stream processing” or “complex event processing” engine. The platform listens to sequences of events, identifying and responding to patterns of interest as they occur on those streams. An example of complex event processing in a business scenario might be “inform us when a high-value customer places three orders in a 7-day window and one of those orders is for a product in the Electronics category”.
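To make the idea concrete, the pattern above can be sketched as a small windowed detector. This is an illustrative sketch only, not Timeflow's actual API: the event shape, the `detect_pattern` function, and the `high_value_customers` set are all assumptions made for the example.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(days=7)

def detect_pattern(events, high_value_customers):
    """Yield (customer, timestamp) alerts when a high-value customer
    places three or more orders within a 7-day window and at least
    one of those orders is in the Electronics category.

    `events` is an iterable of (timestamp, customer, category)
    tuples, assumed to arrive in timestamp order.
    """
    recent = defaultdict(deque)  # customer -> deque of (ts, category)
    for ts, customer, category in events:
        if customer not in high_value_customers:
            continue
        window = recent[customer]
        window.append((ts, category))
        # Evict orders that have fallen out of the 7-day window.
        while window and ts - window[0][0] > WINDOW:
            window.popleft()
        if len(window) >= 3 and any(c == "Electronics" for _, c in window):
            yield (customer, ts)

# Hypothetical usage: alice matches the pattern, bob does not.
events = [
    (datetime(2024, 1, 1), "alice", "Books"),
    (datetime(2024, 1, 2), "alice", "Electronics"),
    (datetime(2024, 1, 3), "alice", "Toys"),
    (datetime(2024, 1, 4), "bob", "Electronics"),
]
alerts = list(detect_pattern(events, {"alice", "bob"}))
```

A real stream processor would evaluate this continuously over unbounded input rather than a finite list, but the core logic — a sliding time window plus a predicate over its contents — is the same.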
By design, Timeflow is tightly integrated with Apache Druid as its event store. We recently wrote this article describing the journey we went on with datastores for event processing, explaining why we chose Druid.
We currently use Druid in three ways:
One way to look at Timeflow, therefore, is as an enhancement to Apache Druid that adds this stream processing capability. Most teams looking for stream processing will consider solutions such as Kafka Streams, Kinesis, Spark or Storm. These are great platforms, but organisations implementing them will find they need to build integration between their stream processor and their event store. By deeply integrating the two, we believe we offer an incredibly simple model that does not sacrifice scalability or latency.
We would be interested to learn more about how the community is combining event processing and Druid. Please do get in touch for an informal conversation about your experiences.