Real-time data has become the norm, said Ben Gamble from Ververica
Processing data in real time used to be the preserve of mission-critical applications because of the associated cost and lack of accessibility, but this is no longer the case, Ben Gamble, Field CTO of Ververica, told IoT Insider in an exclusive interview; the democratisation of the technology has opened the door to new and interesting use cases.
Thanks to the evolution of data processing, a wide variety of applications has opened up. Take ordering food, for example. Where in the past a person would ring up a restaurant and place an order, the rise of food delivery apps means this person can now place their order, be notified when that order is ready, and track their delivery driver’s route in real time.
“Nobody’s going to accept anything being slower anymore,” said Gamble, neatly encapsulating how the digitalisation of our world has shaped consumer expectations.

Ververica specialises in stream processing technology, which means acting on a continuous sequence of events as they arrive, in real time. Apache Flink, the unified streaming data platform its offering is built on, was created to answer the question: “what happens when data gets big?”
Apache Flink was donated to the Apache Software Foundation as an open-source project, and the service Ververica offers is its commercialised version.
“The first rule is that if you want to act on data which is big, you often need more than one machine, but you want it to be resilient, you want it to be fast,” said Gamble. “The one thing you can safely say is that no one ever paid for anything to be slower.”
Examples of where Apache Flink can be deployed range from a warning going off in an oil field to access control – making sure that when someone taps their card to get into a building, they’re allowed in instantly.
“What we do is offer a service that allows you to have hard guarantees over your data, while having the ability to act at scale and speed and with the flexibility over data streams.”
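To make the idea concrete, here is a minimal sketch of what such an alerting pipeline can look like in Apache Flink’s DataStream API in Java. The SensorReading type, the threshold and the inline sample data are invented for illustration – they stand in for a real unbounded source such as Kafka and are not Ververica’s actual implementation.

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class PressureAlertJob {

    // Hypothetical event type. A simple Flink-friendly POJO:
    // public fields plus a no-argument constructor.
    public static class SensorReading {
        public String sensorId;
        public double pressure;

        public SensorReading() {}

        public SensorReading(String sensorId, double pressure) {
            this.sensorId = sensorId;
            this.pressure = pressure;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // In a real deployment this would be an unbounded source such as Kafka;
        // a fixed collection keeps the sketch self-contained and runnable.
        DataStream<SensorReading> readings = env.fromElements(
                new SensorReading("well-1", 310.0),
                new SensorReading("well-2", 512.5));

        // Flag any reading above a made-up safety threshold the moment it arrives.
        readings
                .filter(r -> r.pressure > 500.0)
                .map(r -> "ALERT: " + r.sensorId + " pressure=" + r.pressure)
                .returns(Types.STRING)
                .print();

        env.execute("pressure-alert-sketch");
    }
}
```

The same shape of job scales out across many machines: Flink partitions the stream, checkpoints its state for resilience, and keeps the latency low – the “hard guarantees over your data” Gamble describes.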
Stream processing technology
Stream processing technology is there to resolve the “curse of success” that technologies and systems experience when they become more popular and widely used, Gamble said.
“Let’s say you build a dash cam which alerts you when something’s going wrong,” he said. “That’s great. But when you have insurance companies subscribing to the feed behind the scenes … there are a million cars and 10 million messages a second … so you need to be able to process every single message once, but within a time frame that makes sense.”
This is where stream processing comes in, and the nuance is important. Gamble relates his past experience working on water sensors as an example of where “it didn’t matter if your data was a few weeks late, because you were looking for trends over a long period”. But for use cases such as detecting flood events, “that’s not fast enough, you need to know within the minute”.
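As a rough illustration of that “within the minute” requirement, the following sketch uses Flink’s event-time tumbling windows to report the peak reading per sensor per minute. The WaterLevel type, the ten-second out-of-orderness bound and the sample data are assumptions made for the example, not Gamble’s actual setup.

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class FloodWindowJob {

    // Hypothetical reading from a water-level sensor (Flink POJO).
    public static class WaterLevel {
        public String sensorId;
        public double level;       // metres
        public long timestampMs;   // event time of the reading

        public WaterLevel() {}

        public WaterLevel(String sensorId, double level, long timestampMs) {
            this.sensorId = sensorId;
            this.level = level;
            this.timestampMs = timestampMs;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<WaterLevel> levels = env.fromElements(
                new WaterLevel("river-3", 1.2, 1_700_000_000_000L),
                new WaterLevel("river-3", 2.9, 1_700_000_030_000L),
                new WaterLevel("river-7", 0.8, 1_700_000_010_000L))
            // Event-time semantics: tolerate readings arriving up to 10s out of order.
            .assignTimestampsAndWatermarks(
                WatermarkStrategy.<WaterLevel>forBoundedOutOfOrderness(Duration.ofSeconds(10))
                    .withTimestampAssigner((event, ts) -> event.timestampMs));

        // One-minute tumbling windows per sensor, keeping the highest reading seen,
        // so a flood spike is surfaced within the minute rather than weeks later.
        levels
            .keyBy(e -> e.sensorId)
            .window(TumblingEventTimeWindows.of(Time.minutes(1)))
            .maxBy("level")
            .print();

        env.execute("flood-window-sketch");
    }
}
```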
“Classically you took the data, you dumped it in a database, and you did some work on it,” explained Gamble. “But as we got to the point where we had the connectivity to talk to the device … Now that data is available so that you can extract the maximum value from it.”
Gamble said he saw the “democratisation” of tools such as Apache Flink as being responsible for opening up these new applications.
“I keep jumping between infrastructure examples and then scooters to show this stuff has become normalised,” he said. “That’s because of the democratisation of stream processing through technology like Flink.”
Examples of where stream processing can help
Outside of scooters, dashcams and flood detection, Gamble mentioned a couple of interesting customer examples where stream processing is playing a big part. One customer, Booking.com, leverages the software as an orchestration tool to enhance performance and reduce operational overhead within its security department. By integrating Ververica’s solution, Booking.com has significantly improved the scalability of its security-focused data workflows: processing throughput increased from 1GB per second to 20GB per second, with peaks of up to 40GB per second.
Another customer, Square – which belongs to the umbrella company Block, also the owner of Cash App – faces greater regulatory pressures, but the expectation from the end user remains that if they tap their card to pay for something, that transaction will go through.
“User experience is nearly always the driving force here,” stressed Gamble. “If I tap my card to pay, even when the regulations get tighter, I do not expect it to be slower than it was last time.”
Gamble credited Covid as a major trigger for the digitalisation of industries and processes.
“You might have heard the joke, what was the main driver of digital transformation? Your CIO, your customers, your board or Covid-19. [With Covid-19] everything needed to be available, remote. So that mandated that everything would go online and connect.”
Real time becoming the norm
“The big thing I see is a convergence towards real time as the norm,” said Gamble. “For a long time real time was impossible, so we didn’t do real time … but as we go further and further, we mechanise things, we computerise things, we had to slow down because the machine in this room couldn’t talk to the machine in that room. We’re going back to a better model.”