As more data is generated by devices, sensors and “things,” more of it will need to be analysed while it is still streaming.
Event stream processing involves quickly analysing time-based data as it is created, before it is stored – even at the instant it streams from one device to another.
“Traditional analytics applies processing after the data is stored, but for an increasing number of situations, these insights are too late,” SAS global product marketing manager Fiona McNeill said.
“Directly working with event data, when they happen, allows for faster reaction time – even influencing a situation before it’s over.”
The cost of IoT deployments is also driving the uptake of event stream processing, according to Evan Stubbs, chief analytics officer at SAS Australia and New Zealand.
“Smart devices are fundamentally expensive to make,” Stubbs told IoT Hub.
“If you look at a bill of materials, to make devices smart you have to use good chips, and if you want the intelligence on the device itself, then you’ve got to add in the cost of batteries for additional power.
“The beauty of the Internet of Things is that if you can centralise the intelligence somewhere else, you can reduce the cost of these devices. You can still have fairly intelligent devices in the field, it’s just the intelligence is being centrally located.”
Event stream processing is already considered important in applications as diverse as smart grid stabilisation, predictive asset maintenance and digital marketing.
Stubbs believed that in A/NZ, agriculture could be a significant beneficiary of event stream processing.
According to McNeill, event stream processing can occur in three distinct places: at the edge of the network, in the stream, or on data that’s at rest, out of the stream.
Edge analytics
At-the-edge analytics processes data on the same device from which it streams. That device could be a thermostat, an iPhone or any single sensor with processing capabilities.
This type of analytics works with minimal context around the data, and is often confined to rudimentary rules and simple statistics such as an average or standard deviation.
Simple commands can be automated using analytics at the edge, such as instructions to turn something on or off, or to stop or go. For example, a thermostat adjusts based on temperature fluctuations.
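As a rough illustration, an edge rule of this kind can be a few lines of code. The Python sketch below uses an assumed thermostat whose target, deadband and heater interface are purely illustrative, not any vendor's implementation; it decides a heater state from a single on-device reading:

    TARGET_C = 21.0      # desired temperature (illustrative)
    DEADBAND_C = 0.5     # hysteresis band to avoid rapid on/off switching

    def on_reading(temp_c, heater_on):
        """Decide the new heater state from one on-device sensor reading."""
        if temp_c < TARGET_C - DEADBAND_C:
            return True       # too cold: switch the heater on
        if temp_c > TARGET_C + DEADBAND_C:
            return False      # warm enough: switch it off
        return heater_on      # inside the deadband: keep the current state

The deadband is what makes even this trivial rule a form of analytics rather than a bare threshold: it uses a little state to stop the device oscillating around the set point.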
In-stream
In-stream analytics occurs as data streams from one device to another, or from multiple sensors to an aggregation point.
This type of analysis combines events of different types and formats that are transmitted at varying rates.
Analytics on multiple stream inputs has a richer context and can be used to identify more complex patterns of interest, or even connect a desired chain of actions.
In-stream analytics can also be used to automate or trigger more sophisticated prescriptive actions. For example, by analysing a subscriber's mobile phone use relative to their plan, offers can be triggered based on location and activity.
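A minimal sketch of what in-stream analysis can look like, assuming events arrive as small records with a sensor name, a timestamp in seconds and a value (all names and thresholds here are illustrative): it keeps a sliding window per sensor and flags a cross-stream pattern, vibration and temperature both running hot, that no single stream reveals on its own.

    from collections import deque

    WINDOW_S = 60.0  # sliding window length in seconds

    def add_to_window(window, event):
        """Append an event and evict anything older than the window."""
        window.append(event)
        cutoff = event["ts"] - WINDOW_S
        while window and window[0]["ts"] < cutoff:
            window.popleft()

    def check_asset(windows, event):
        """Flag when vibration and temperature both run hot in one window."""
        win = windows.setdefault(event["sensor"], deque())
        add_to_window(win, event)
        vib = windows.get("vibration", deque())
        temp = windows.get("temperature", deque())
        if any(e["value"] > 5.0 for e in vib) and \
           any(e["value"] > 80.0 for e in temp):
            return "maintenance_alert"
        return None

    windows = {}
    for event in ({"sensor": "vibration", "ts": 0.0, "value": 6.1},
                  {"sensor": "temperature", "ts": 1.0, "value": 85.2}):
        alert = check_asset(windows, event)  # second event raises the alert

In a production system the windowing, joins and pattern rules would run inside an event stream processing engine; the sketch only shows the shape of the logic.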
At-rest
At-rest analytics is applied to a historical repository of data, which can include both data saved from event streams and other stored information – so it's processed after the fact.
With the big data volumes that streaming generates, high-performance analytics is needed for effective processing.
And time can be saved by cleansing and normalising data while it's in motion – before it's stored, even in large data lakes such as those built on Hadoop.
At-rest analytics is based on rich, historical context – the perspective required to create predictive analytical models and forecasts and to discover new patterns of interest.
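At rest, the full weight of the historical record can be used to fit models that are later pushed back into the stream or to the edge. The sketch below shows the shape of that step; the file and column names are invented for illustration, and logistic regression is just one common modelling choice.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Historical repository: saved stream data plus other stored information.
    history = pd.read_csv("sensor_history.csv")        # assumed file name
    X = history[["avg_temp", "avg_vibration", "age_days"]]  # assumed features
    y = history["failed_within_30d"]                   # label from past outcomes

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    model = LogisticRegression().fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))
    # the fitted model can then score live events in-stream or at the edge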
ESP in action
More advanced organisations using event stream processing will deploy all three tactics in a multiphase analytics system, said McNeill, optimising the decisions made at each step of the process.
“Multiphase analytics can analyse data throughout the event spectrum to inform what sensors are needed where and when, what new patterns of interest are emerging – and to provide continuous interactive monitoring for situational awareness in centralised command centres.”
Stubbs said SAS is focusing its efforts on ways to embed intelligence into distributed fabrics.
“We’re building the technologies to be able to take massive data streams and to apply intelligence to those streams as they’re being created,” he said.
This article includes content from SAS Insights adapted with permission.